[ 487.115266] env[60714]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 487.770357] env[60764]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 489.107168] env[60764]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60764) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 489.107489] env[60764]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60764) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 489.107599] env[60764]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60764) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 489.107900] env[60764]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 489.309632] env[60764]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60764) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 489.320046] env[60764]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=60764) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 489.424967] env[60764]: INFO nova.virt.driver [None req-ff020f8a-f487-4bfd-bc74-f4b2bf0220f6 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 489.498186] env[60764]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 489.498382] env[60764]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 489.498451] env[60764]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60764) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 492.327786] env[60764]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-716677b9-125a-4421-b121-e8e7a11f687c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.344565] env[60764]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60764) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 492.344719] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-5041e9b8-c34c-46ae-b23f-643fa78371f6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.369554] env[60764]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 90792.
[ 492.369683] env[60764]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.871s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 492.370233] env[60764]: INFO nova.virt.vmwareapi.driver [None req-ff020f8a-f487-4bfd-bc74-f4b2bf0220f6 None None] VMware vCenter version: 7.0.3
[ 492.373638] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c7ec392-bcc0-4162-8897-432c539c4418 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.391303] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffe24a18-9fb9-40b2-9151-4b2da33a0e24 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.397381] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86b5bdd4-c8a9-4016-af14-dda0cc6492a4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.404098] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fc1490a-4be6-42b7-8f55-9faa06c4568c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.417094] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7b1988e-40dc-4948-a6c9-27486776496f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.422957] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4fbde2f-f42b-4299-890b-ea1c695843fc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.453561] env[60764]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-6b042d4f-9f11-43ba-954d-ccf8104a65db {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 492.458699] env[60764]: DEBUG nova.virt.vmwareapi.driver [None req-ff020f8a-f487-4bfd-bc74-f4b2bf0220f6 None None] Extension org.openstack.compute already exists. {{(pid=60764) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 492.461293] env[60764]: INFO nova.compute.provider_config [None req-ff020f8a-f487-4bfd-bc74-f4b2bf0220f6 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 492.480177] env[60764]: DEBUG nova.context [None req-ff020f8a-f487-4bfd-bc74-f4b2bf0220f6 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),f99c1a1e-e016-4393-98d5-5a9785540605(cell1) {{(pid=60764) load_cells /opt/stack/nova/nova/context.py:464}}
[ 492.482139] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 492.482370] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 492.483101] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 492.483530] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Acquiring lock "f99c1a1e-e016-4393-98d5-5a9785540605" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 492.483721] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Lock "f99c1a1e-e016-4393-98d5-5a9785540605" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 492.484736] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Lock "f99c1a1e-e016-4393-98d5-5a9785540605" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 492.509432] env[60764]: INFO dbcounter [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Registered counter for database nova_cell0
[ 492.517892] env[60764]: INFO dbcounter [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Registered counter for database nova_cell1
[ 492.520859] env[60764]: DEBUG oslo_db.sqlalchemy.engines [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60764) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 492.521242] env[60764]: DEBUG oslo_db.sqlalchemy.engines [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60764) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 492.525725] env[60764]: DEBUG dbcounter [-] [60764] Writer thread running {{(pid=60764) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 492.526460] env[60764]: DEBUG dbcounter [-] [60764] Writer thread running {{(pid=60764) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 492.528528] env[60764]: ERROR nova.db.main.api [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 492.528528] env[60764]: result = function(*args, **kwargs)
[ 492.528528] env[60764]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 492.528528] env[60764]: return func(*args, **kwargs)
[ 492.528528] env[60764]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 492.528528] env[60764]: result = fn(*args, **kwargs)
[ 492.528528] env[60764]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 492.528528] env[60764]: return f(*args, **kwargs)
[ 492.528528] env[60764]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 492.528528] env[60764]: return db.service_get_minimum_version(context, binaries)
[ 492.528528] env[60764]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 492.528528] env[60764]: _check_db_access()
[ 492.528528] env[60764]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 492.528528] env[60764]: stacktrace = ''.join(traceback.format_stack())
[ 492.528528] env[60764]:
[ 492.529559] env[60764]: ERROR nova.db.main.api [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 492.529559] env[60764]: result = function(*args, **kwargs)
[ 492.529559] env[60764]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 492.529559] env[60764]: return func(*args, **kwargs)
[ 492.529559] env[60764]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 492.529559] env[60764]: result = fn(*args, **kwargs)
[ 492.529559] env[60764]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 492.529559] env[60764]: return f(*args, **kwargs)
[ 492.529559] env[60764]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 492.529559] env[60764]: return db.service_get_minimum_version(context, binaries)
[ 492.529559] env[60764]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 492.529559] env[60764]: _check_db_access()
[ 492.529559] env[60764]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 492.529559] env[60764]: stacktrace = ''.join(traceback.format_stack())
[ 492.529559] env[60764]:
[ 492.529923] env[60764]: WARNING nova.objects.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 492.530057] env[60764]: WARNING nova.objects.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Failed to get minimum service version for cell f99c1a1e-e016-4393-98d5-5a9785540605
[ 492.530488] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Acquiring lock "singleton_lock" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 492.530652] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Acquired lock "singleton_lock" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 492.530892] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Releasing lock "singleton_lock" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 492.531228] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Full set of CONF: {{(pid=60764) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 492.531373] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ******************************************************************************** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 492.531501] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] Configuration options gathered from: {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 492.531628] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 492.531810] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 492.531955] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ================================================================================ {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 492.532175] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] allow_resize_to_same_host = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 492.532353] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] arq_binding_timeout = 300 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 492.532484] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] backdoor_port = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 492.532608] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] backdoor_socket = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 492.532768] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] block_device_allocate_retries = 60 {{(pid=60764) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.532931] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] block_device_allocate_retries_interval = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.533111] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cert = self.pem {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.533278] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.533441] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute_monitors = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.533604] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] config_dir = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.533768] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] config_drive_format = iso9660 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.533899] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.534101] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] config_source = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.534272] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] console_host = devstack {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.534437] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] control_exchange = nova {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.534592] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cpu_allocation_ratio = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.534746] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] daemon = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.534910] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] debug = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.535075] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] default_access_ip_network_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.535241] 
env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] default_availability_zone = nova {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.535407] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] default_ephemeral_format = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.535571] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] default_green_pool_size = 1000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.535809] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.535968] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] default_schedule_zone = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.536150] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] disk_allocation_ratio = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.536330] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] enable_new_services = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.536509] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] enabled_apis = ['osapi_compute'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.536669] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] enabled_ssl_apis = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.536825] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] flat_injected = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.536978] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] force_config_drive = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.537147] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] force_raw_images = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.537315] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 
None None] graceful_shutdown_timeout = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.537473] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] heal_instance_info_cache_interval = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.537689] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] host = cpu-1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.537856] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.538023] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.538185] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.538391] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.538548] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_build_timeout = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.538703] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_delete_interval = 300 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.538867] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_format = [instance: %(uuid)s] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.539045] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_name_template = instance-%08x {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.539212] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_usage_audit = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.539379] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_usage_audit_period = month {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.539540] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.539702] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.539863] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] internal_service_availability_zone = internal {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540024] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] key = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540186] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] live_migration_retry_count = 30 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540347] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_config_append = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540510] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540665] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_dir = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540820] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.540944] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_options = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.541114] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_rotate_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.541283] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_rotate_interval_type = days {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.541444] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] log_rotation_type = none {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.541604] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.541691] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.541853] env[60764]: DEBUG 
oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.542099] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.542210] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.542380] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] long_rpc_timeout = 1800 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.542542] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] max_concurrent_builds = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.542697] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] max_concurrent_live_migrations = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.542851] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] max_concurrent_snapshots = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543013] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] max_local_block_devices = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543177] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] max_logfile_count = 30 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543335] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] max_logfile_size_mb = 200 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543491] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] maximum_instance_delete_attempts = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543654] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metadata_listen = 0.0.0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543817] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metadata_listen_port = 8775 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.543979] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metadata_workers = 2 {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.544179] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] migrate_max_retries = -1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.544353] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] mkisofs_cmd = genisoimage {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.544556] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.544689] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] my_ip = 10.180.1.21 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.544849] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] network_allocate_retries = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.545039] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.545210] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.545371] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] osapi_compute_listen_port = 8774 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.545535] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] osapi_compute_unique_server_name_scope = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.545701] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] osapi_compute_workers = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.545859] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] password_length = 12 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.546030] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] periodic_enable = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.546196] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] periodic_fuzzy_delay = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.546366] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] pointer_model = usbtablet {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.546599] env[60764]: 
DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] preallocate_images = none {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.546778] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] publish_errors = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.546908] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] pybasedir = /opt/stack/nova {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.547076] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ram_allocation_ratio = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.547242] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rate_limit_burst = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.547407] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rate_limit_except_level = CRITICAL {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.547564] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rate_limit_interval = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.547721] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reboot_timeout = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.547877] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reclaim_instance_interval = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.548040] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] record = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.548213] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reimage_timeout_per_gb = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.548378] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] report_interval = 120 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.548538] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rescue_timeout = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.548694] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reserved_host_cpus = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.548852] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reserved_host_disk_mb = 0 {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.549014] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reserved_host_memory_mb = 512 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.549177] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] reserved_huge_pages = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.549340] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] resize_confirm_window = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.549498] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] resize_fs_using_block_device = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.549656] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] resume_guests_state_on_host_boot = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.549820] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.550815] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rpc_response_timeout = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.550815] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] run_external_periodic_tasks = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.550815] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] running_deleted_instance_action = reap {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.550815] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.550815] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] running_deleted_instance_timeout = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.550815] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler_instance_sync_interval = 120 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551267] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_down_time = 720 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551267] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] servicegroup_driver = db {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551267] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] shelved_offload_time = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551665] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] shelved_poll_interval = 3600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551665] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] shutdown_timeout = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551749] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] source_is_ipv6 = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.551864] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ssl_only = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.552132] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.552308] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] sync_power_state_interval = 600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.552468] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] sync_power_state_pool_size = 1000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554316] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] syslog_log_facility = LOG_USER {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554316] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] tempdir = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554316] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] timeout_nbd = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554316] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] transport_url = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554316] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] update_resources_interval = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554316] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] use_cow_images = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554891] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 
None None] use_eventlog = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554891] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] use_journal = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554891] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] use_json = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554891] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] use_rootwrap_daemon = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554891] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] use_stderr = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.554891] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] use_syslog = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555227] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vcpu_pin_set = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555227] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plugging_is_fatal = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555227] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plugging_timeout = 300 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555227] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] virt_mkfs = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555227] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] volume_usage_poll_interval = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555436] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] watch_log_file = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555436] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] web = /usr/share/spice-html5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 492.555630] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_concurrency.disable_process_locking = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.555930] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.556128] 
env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.556302] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.556475] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.556639] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.556802] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.556982] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.auth_strategy = keystone {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.557164] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.compute_link_prefix = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.557342] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.557517] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.dhcp_domain = novalocal {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.557684] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.enable_instance_password = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.557847] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.glance_link_prefix = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.558027] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.558192] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.558355] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
api.instance_list_per_project_cells = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.558547] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.list_records_by_skipping_down_cells = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.558723] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.local_metadata_per_cell = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.558895] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.max_limit = 1000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.559076] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.metadata_cache_expiration = 15 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.559257] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.neutron_default_tenant_id = default {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.559424] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.use_forwarded_for = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.559588] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.use_neutron_default_nets = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.559754] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.559915] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.560093] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.560270] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.560443] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.vendordata_dynamic_targets = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.560605] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api.vendordata_jsonfile_path = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.560787] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
api.vendordata_providers = ['StaticJSON'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.560977] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.backend = dogpile.cache.memcached {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.561159] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.backend_argument = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.561332] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.config_prefix = cache.oslo {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.561497] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.dead_timeout = 60.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.561659] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.debug_cache_backend = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.561818] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.enable_retry_client = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.562029] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.enable_socket_keepalive = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.562224] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.enabled = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.562400] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.expiration_time = 600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.562563] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.hashclient_retry_attempts = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.562729] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.562892] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_dead_retry = 300 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.563072] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_password = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.563240] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.563402] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.563563] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_pool_maxsize = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.563724] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.563883] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_sasl_enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.564089] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.564273] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.564446] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.memcache_username = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.564614] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.proxies = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.564781] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.retry_attempts = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.564947] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.retry_delay = 0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.565149] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.socket_keepalive_count = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.565295] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.socket_keepalive_idle = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.565457] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.socket_keepalive_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.565617] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.tls_allowed_ciphers = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.565772] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.tls_cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.565927] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.tls_certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.566102] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.tls_enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.566265] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cache.tls_keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.566433] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.566606] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.auth_type = password {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.566766] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.566939] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.567110] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.567275] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.567435] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.cross_az_attach = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.567598] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.debug = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.567756] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.endpoint_template = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.567917] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.http_retries = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.568088] env[60764]: DEBUG oslo_service.service [None 
req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.568253] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.568427] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.os_region_name = RegionOne {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.568589] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.568749] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cinder.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.568920] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.569089] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.cpu_dedicated_set = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.569249] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.cpu_shared_set = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.569411] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.image_type_exclude_list = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.569572] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.569731] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.569890] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.570065] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.570241] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.570676] env[60764]: DEBUG oslo_service.service 
[None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.resource_provider_association_refresh = 300 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.570676] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.shutdown_retry_interval = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.570777] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.570949] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] conductor.workers = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.571142] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] console.allowed_origins = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.571307] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] console.ssl_ciphers = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.571476] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] console.ssl_minimum_version = default {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.571646] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] consoleauth.token_ttl = 600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.571819] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.571996] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.572191] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.572355] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.572511] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.572667] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.572827] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
cyborg.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.572984] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.573163] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.573318] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.573471] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.region_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.573626] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.service_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.573793] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.service_type = accelerator {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.573952] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.574155] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.574327] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.574485] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.574666] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.574825] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] cyborg.version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.575019] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.backend = sqlalchemy {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.575201] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.connection = **** {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.575374] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.connection_debug = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.575545] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.connection_parameters = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.575709] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.connection_recycle_time = 3600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.575876] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.connection_trace = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.576051] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.db_inc_retry_interval = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.576220] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.db_max_retries = 20 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.576382] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.db_max_retry_interval = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.576544] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.db_retry_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.576712] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.max_overflow = 50 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.576875] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.max_pool_size = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.577050] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.max_retries = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.577225] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.577389] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.mysql_wsrep_sync_wait = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.577551] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.pool_timeout = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.577715] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.retry_interval = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578776] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.slave_connection = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578776] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.sqlite_synchronous = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578776] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] database.use_db_reconnect = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578776] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.backend = sqlalchemy {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578776] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.connection = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578776] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.connection_debug = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.578989] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.connection_parameters = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.579048] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.connection_recycle_time = 3600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.579206] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.connection_trace = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.579373] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.db_inc_retry_interval = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.579535] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.db_max_retries = 20 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.579701] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.db_max_retry_interval = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.579853] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.db_retry_interval = 1 {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.580030] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.max_overflow = 50 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.580203] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.max_pool_size = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.580403] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.max_retries = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.580580] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.580741] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.580903] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.pool_timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.581084] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.retry_interval = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.581249] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.slave_connection = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.581415] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] api_database.sqlite_synchronous = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.581588] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] devices.enabled_mdev_types = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.581762] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.581929] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ephemeral_storage_encryption.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.582136] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.582314] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.api_servers = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.582475] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.582669] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.582837] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.582994] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.583168] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.583331] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.debug = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.583494] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.default_trusted_certificate_ids = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.583653] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.enable_certificate_validation = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.583811] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.enable_rbd_download = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.583967] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.584168] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.584341] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.584500] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.584655] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.584816] env[60764]: DEBUG 
oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.num_retries = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.584982] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.rbd_ceph_conf = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.585159] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.rbd_connect_timeout = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.585326] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.rbd_pool = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.585489] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.rbd_user = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.585645] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.region_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.585799] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.service_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.585962] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.service_type = image {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.586136] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.586296] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.586453] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.586607] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.586784] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.586946] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.verify_glance_signatures = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.587116] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] glance.version = None {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.587285] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] guestfs.debug = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.587452] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.config_drive_cdrom = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.587615] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.config_drive_inject_password = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.587779] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.587942] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.588118] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.enable_remotefx = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.588292] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.instances_path_share = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.588457] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.iscsi_initiator_list = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.588617] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.limit_cpu_features = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.588777] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.588937] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.589107] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.589275] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.589445] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.589605] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.use_multipath_io = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.589767] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.589925] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.590094] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.vswitch_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.590261] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.590426] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] mks.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.590791] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.590981] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] image_cache.manager_interval = 2400 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.591164] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] image_cache.precache_concurrency = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.591334] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] image_cache.remove_unused_base_images = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.591502] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.591667] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.591840] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] image_cache.subdirectory_name = _base {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.592053] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.api_max_retries 
= 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.592241] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.api_retry_interval = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.592405] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.592566] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.auth_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.592729] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.592878] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593047] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593214] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.conductor_group = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593372] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593529] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593683] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593843] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.593997] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.594194] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.594355] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.594517] env[60764]: DEBUG 
oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.peer_list = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.594705] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.region_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.594875] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.serial_console_state_timeout = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.595043] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.service_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.595222] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.service_type = baremetal {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.595384] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.595540] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.595696] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.595852] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.596040] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.596208] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ironic.version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.596391] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.596564] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] key_manager.fixed_key = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.596745] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.596905] env[60764]: DEBUG oslo_service.service [None 
req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.barbican_api_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.597074] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.barbican_endpoint = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.597250] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.barbican_endpoint_type = public {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.597409] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.barbican_region_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.597564] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.597720] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.597881] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.598051] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.598215] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.598377] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.number_of_retries = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.598537] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.retry_delay = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.598697] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.send_service_user_token = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.598858] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599029] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599196] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.verify_ssl = True {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599357] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican.verify_ssl_path = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599522] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599680] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.auth_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599841] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.599989] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.600164] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.600325] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.600481] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.600641] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.600795] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] barbican_service_user.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.600960] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.approle_role_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.601131] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.approle_secret_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.601290] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.601447] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.certfile = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.601608] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.601765] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.601924] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.602166] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.kv_mountpoint = secret {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.602349] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.kv_path = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.602517] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.kv_version = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.602674] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.namespace = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.602837] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.root_token_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.602987] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.603160] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.ssl_ca_crt_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.603321] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.603479] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.use_ssl = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.603645] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.603810] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.603970] env[60764]: DEBUG oslo_service.service [None 
req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.auth_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.604164] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.604335] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.604498] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.604655] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.604814] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.604968] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.605141] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.605300] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.605455] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.605607] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.605759] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.region_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.605911] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.service_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.606089] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.service_type = identity {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.606256] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.split_loggers = False {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.606411] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.606566] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.606756] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.606942] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.607117] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] keystone.version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.607321] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.connection_uri = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.607481] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_mode = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.607646] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.607813] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_models = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.607982] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_power_governor_high = performance {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.608163] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.608330] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_power_management = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.608535] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.608682] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.device_detach_attempts = 8 {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.608858] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.device_detach_timeout = 20 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.608975] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.disk_cachemodes = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.609149] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.disk_prefix = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.609318] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.enabled_perf_events = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.609481] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.file_backed_memory = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.609641] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.gid_maps = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.609796] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.hw_disk_discard = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.609950] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.hw_machine_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.610129] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_rbd_ceph_conf = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.610295] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.610459] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.610624] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_rbd_glance_store_name = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.610793] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_rbd_pool = rbd {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.610959] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_type = default {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.611128] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.images_volume_group = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.611289] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.inject_key = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.611444] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.inject_partition = -2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.611600] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.inject_password = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.611758] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.iscsi_iface = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.611914] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.iser_use_multipath = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.612131] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.612307] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.612470] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_downtime = 500 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.612630] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.612791] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.612946] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_inbound_addr = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.613120] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.613284] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.613441] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_scheme = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.613610] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_timeout_action = abort {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.613770] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_tunnelled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.613922] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_uri = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.614109] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.live_migration_with_native_tls = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.614281] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.max_queues = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.614443] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.614598] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.nfs_mount_options = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.614913] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.615101] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.615276] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.615432] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.615591] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.615752] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.num_pcie_ports = 0 
{{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.615915] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.616089] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.pmem_namespaces = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.616255] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.quobyte_client_cfg = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.616535] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.616702] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.616864] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617034] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617200] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rbd_secret_uuid = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617356] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rbd_user = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617515] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617684] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617840] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rescue_image_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.617994] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rescue_kernel_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.618162] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rescue_ramdisk_id = None {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.618332] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.618483] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.rx_queue_size = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.618663] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.smbfs_mount_options = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.618954] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.619141] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.snapshot_compression = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.619304] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.snapshot_image_format = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.619518] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.619683] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.sparse_logical_volumes = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.619844] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.swtpm_enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620015] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.swtpm_group = tss {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620182] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.swtpm_user = tss {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620353] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.sysinfo_serial = unique {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620513] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.tb_cache_size = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620669] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.tx_queue_size = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620832] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.uid_maps = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.620993] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.use_virtio_for_bridges = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.621175] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.virt_type = kvm {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.621347] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.volume_clear = zero {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.621506] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.volume_clear_size = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.621670] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.volume_use_multipath = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.621827] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_cache_path = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.622014] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.622229] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.622405] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.622576] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.622847] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.623066] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.vzstorage_mount_user = stack {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.623200] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.623374] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.623546] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.auth_type = password {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.623703] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.623858] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.624030] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.624220] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.624384] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.624551] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.default_floating_pool = public {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.624705] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.624863] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.extension_sync_interval = 600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625029] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.http_retries = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625194] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625350] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625502] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625667] env[60764]: 
DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625820] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.625983] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.ovs_bridge = br-int {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.626158] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.physnets = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.626328] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.region_name = RegionOne {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.626493] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.service_metadata_proxy = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.626647] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.service_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.626808] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.service_type = network {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.626967] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.627134] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.627293] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.627448] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.627623] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.627780] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] neutron.version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.627945] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None 
None] notifications.bdms_in_notifications = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.628130] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] notifications.default_level = INFO {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.628308] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] notifications.notification_format = unversioned {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.628470] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] notifications.notify_on_state_change = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.628642] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.628816] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] pci.alias = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.628979] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] pci.device_spec = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.629158] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] pci.report_in_placement = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.629333] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.629503] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.auth_type = password {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.629665] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.629821] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.629974] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.630145] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.630304] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
placement.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.630459] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.630612] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.default_domain_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.630796] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.default_domain_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.630965] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.domain_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.631136] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.domain_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.631294] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.631453] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.631606] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.631758] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.631913] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.632127] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.password = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.632296] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.project_domain_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.632465] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.project_domain_name = Default {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.632626] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.project_id = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.632795] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.project_name = service {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.632962] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.region_name = RegionOne {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.633182] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.service_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.633302] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.service_type = placement {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.633465] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.633623] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.633781] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.633937] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.system_scope = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.634131] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.634306] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.trust_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.634464] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.user_domain_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.634631] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.user_domain_name = Default {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.634789] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.user_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.634958] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.username = placement {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
492.635157] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.635320] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] placement.version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.635494] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.cores = 20 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.635656] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.count_usage_from_placement = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.635825] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.635990] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.injected_file_content_bytes = 10240 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.636172] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.injected_file_path_length = 255 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.636339] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.injected_files = 5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.636502] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.instances = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.636663] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.key_pairs = 100 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.636824] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.metadata_items = 128 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.636987] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.ram = 51200 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.637162] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.recheck_quota = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.637329] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] quota.server_group_members = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.637495] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None 
None] quota.server_groups = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.637657] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rdp.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.637968] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.638164] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.638332] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.638493] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.image_metadata_prefilter = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.638654] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.638813] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.max_attempts = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.638972] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.max_placement_results = 1000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.639148] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.639311] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.639467] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.639636] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] scheduler.workers = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.639803] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
492.639970] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.640157] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.640330] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.640491] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.640653] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.640812] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.640994] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.641172] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.host_subset_size = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.641339] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.641495] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.641656] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.641820] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.isolated_hosts = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.642011] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.isolated_images = [] 
{{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.642218] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.642388] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.642551] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.642713] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.pci_in_placement = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.642874] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.643041] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.643215] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.643377] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.643538] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.643697] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.643857] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.track_instance_changes = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.644064] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.644250] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metrics.required = True {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.644418] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metrics.weight_multiplier = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.644581] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.644744] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] metrics.weight_setting = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.645058] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.645240] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] serial_console.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.645415] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] serial_console.port_range = 10000:20000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.645587] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.645754] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.645921] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] serial_console.serialproxy_port = 6083 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.646097] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.646278] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.auth_type = password {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.646438] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.646594] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.646755] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.collect_timing = False {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.646913] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.647079] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.647272] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.send_service_user_token = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.647434] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.647644] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] service_user.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.647849] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.agent_enabled = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.648029] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.648333] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.648531] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.648706] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.html5proxy_port = 6082 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.648868] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.image_compression = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.649035] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.jpeg_compression = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.649201] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.playback_compression = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.649368] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.server_listen = 127.0.0.1 {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.649536] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.649694] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.streaming_mode = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.649852] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] spice.zlib_compression = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650026] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] upgrade_levels.baseapi = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650189] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] upgrade_levels.cert = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650361] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] upgrade_levels.compute = auto {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650518] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] upgrade_levels.conductor = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650674] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] upgrade_levels.scheduler = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650837] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.650993] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.651162] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.651320] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.651479] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.651635] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.insecure = False {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.651787] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.651959] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.652164] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vendordata_dynamic_auth.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.652349] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.api_retry_count = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.652509] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.ca_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.652679] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.652846] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.cluster_name = testcl1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.653018] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.connection_pool_size = 10 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.653183] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.console_delay_seconds = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.653351] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.datastore_regex = ^datastore.* {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.653559] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.653725] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.host_password = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.653890] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.host_port = 443 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.654086] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.host_username = administrator@vsphere.local {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.654290] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.insecure = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.654455] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.integration_bridge = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.654618] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.maximum_objects = 100 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.654773] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.pbm_default_policy = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.654930] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.pbm_enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.655101] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.pbm_wsdl_location = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.655276] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.655433] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.serial_port_proxy_uri = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.655589] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.serial_port_service_uri = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.655750] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.task_poll_interval = 0.5 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.655921] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.use_linked_clone = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.656101] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.vnc_keymap = en-us {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.656271] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.vnc_port = 5900 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.656432] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vmware.vnc_port_total = 10000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.656616] 
env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.auth_schemes = ['none'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.656790] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.657096] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.657287] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.657459] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.novncproxy_port = 6080 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.657634] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.server_listen = 127.0.0.1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.657804] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.657960] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.vencrypt_ca_certs = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.658128] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.vencrypt_client_cert = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.658286] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vnc.vencrypt_client_key = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.658458] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.658617] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.disable_deep_image_inspection = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.658772] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.658928] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
492.659096] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.659262] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.disable_rootwrap = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.659420] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.enable_numa_live_migration = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.659577] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.659734] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.659890] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.660058] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.libvirt_disable_apic = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.660221] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.660382] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.660536] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.660693] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.660848] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.661008] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.661175] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.661334] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.661490] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.661652] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.661831] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.662040] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.client_socket_timeout = 900 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.662235] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.default_pool_size = 1000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.662405] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.keep_alive = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.662568] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.max_header_line = 16384 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.662725] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.secure_proxy_ssl_header = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.662882] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.ssl_ca_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.663072] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.ssl_cert_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.663207] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.ssl_key_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.663372] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] wsgi.tcp_keepidle = 600 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.663567] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.663703] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] zvm.ca_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.663859] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] zvm.cloud_connector_url = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.664190] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.664369] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] zvm.reachable_timeout = 300 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.664549] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.enforce_new_defaults = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.664716] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.enforce_scope = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.664888] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.policy_default_rule = default {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.665074] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.665254] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.policy_file = policy.yaml {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.665420] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.665578] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.665735] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.665889] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.666061] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.666262] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.666446] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.666623] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.connection_string = messaging:// {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.666788] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.enabled = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.666955] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.es_doc_type = notification {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.667131] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.es_scroll_size = 10000 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.667303] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.es_scroll_time = 2m {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.667463] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.filter_error_trace = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.667629] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.hmac_keys = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.667791] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.sentinel_service_name = mymaster {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.667951] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.socket_timeout = 0.1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.668127] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.trace_requests = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.668302] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler.trace_sqlalchemy = False {{(pid=60764) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.668471] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler_jaeger.process_tags = {} {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.668626] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler_jaeger.service_name_prefix = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.668785] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] profiler_otlp.service_name_prefix = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.668945] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] remote_debug.host = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.669117] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] remote_debug.port = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.669298] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.669461] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.669623] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.669781] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.669939] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.670109] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.670272] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.670430] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.670588] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.670744] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.670910] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.671086] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.671258] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.671421] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.671580] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.671749] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.671924] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.672128] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.672311] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.672477] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.672638] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.672804] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.672962] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.673168] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.673301] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.673468] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.ssl = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.673677] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.673802] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.673961] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.674172] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.674349] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.674537] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.674704] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_notifications.retry = -1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.674887] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.675075] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.675252] env[60764]: DEBUG oslo_service.service 
[None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.auth_section = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.675413] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.auth_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.675569] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.cafile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.675734] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.certfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.675895] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.collect_timing = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676149] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.connect_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676225] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.connect_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676369] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.endpoint_id = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676526] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.endpoint_override = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676685] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.insecure = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676837] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.keyfile = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.676991] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.max_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.677166] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.min_version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.677323] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.region_name = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.677474] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.service_name = None {{(pid=60764) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.677625] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.service_type = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.677781] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.split_loggers = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.677931] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.status_code_retries = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.678098] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.status_code_retry_delay = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.678285] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.timeout = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.678450] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.valid_interfaces = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.678628] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_limit.version = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.678796] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_reports.file_event_handler = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.678959] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.679129] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] oslo_reports.log_dir = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.679299] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.679456] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.679611] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.679771] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.679931] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.680098] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.680269] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.680424] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_ovs_privileged.group = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.680577] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.680738] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.680898] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.681062] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] vif_plug_ovs_privileged.user = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.681233] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.681407] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.681575] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.681741] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.681907] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.682124] 
env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.682307] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.682470] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.682644] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.682811] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.isolate_vif = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.682978] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.683155] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.683327] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.683491] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.683648] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_vif_ovs.per_port_bridge = False {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.683807] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_brick.lock_path = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.683965] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.684168] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.684348] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] privsep_osbrick.capabilities = [21] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.684507] 
env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] privsep_osbrick.group = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.684663] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] privsep_osbrick.helper_command = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.684822] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.684981] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.685173] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] privsep_osbrick.user = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.685349] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.685509] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] nova_sys_admin.group = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.685658] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] nova_sys_admin.helper_command = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.685818] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.685976] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.686145] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] nova_sys_admin.user = None {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 492.686275] env[60764]: DEBUG oslo_service.service [None req-40844749-dfda-43c7-b16e-0d1505cc3546 None None] ******************************************************************************** {{(pid=60764) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 492.686695] env[60764]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 492.696750] env[60764]: WARNING nova.virt.vmwareapi.driver [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 492.697208] env[60764]: INFO nova.virt.node [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Generated node identity 67a94047-1c18-43e8-9b47-05a1d30bcca4 [ 492.697437] env[60764]: INFO nova.virt.node [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Wrote node identity 67a94047-1c18-43e8-9b47-05a1d30bcca4 to /opt/stack/data/n-cpu-1/compute_id [ 492.710344] env[60764]: WARNING nova.compute.manager [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Compute nodes ['67a94047-1c18-43e8-9b47-05a1d30bcca4'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 492.742899] env[60764]: INFO nova.compute.manager [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 492.763656] env[60764]: WARNING nova.compute.manager [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 492.763868] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 492.764143] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 492.764306] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 492.764459] env[60764]: DEBUG nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 492.765621] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5cc7999-4d4d-4ced-b3e1-f933fd919a25 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 492.774272] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7c7c746-187e-4712-9c15-cbd69c8a31ec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 492.787967] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ee04714-14d6-4095-a511-1ff16fb0ca5c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 492.794172] env[60764]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d204f931-2827-4f4c-bc28-d6e5fb6abb8c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 492.823896] env[60764]: DEBUG nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181279MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 492.824098] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 492.824268] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 492.836161] env[60764]: WARNING nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] No compute node record for cpu-1:67a94047-1c18-43e8-9b47-05a1d30bcca4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 67a94047-1c18-43e8-9b47-05a1d30bcca4 could not be found. [ 492.849672] env[60764]: INFO nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 67a94047-1c18-43e8-9b47-05a1d30bcca4 [ 492.902127] env[60764]: DEBUG nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 492.902296] env[60764]: DEBUG nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 493.012312] env[60764]: INFO nova.scheduler.client.report [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] [req-ed2de3eb-666e-4827-ab5b-dd63735501ec] Created resource provider record via placement API for resource provider with UUID 67a94047-1c18-43e8-9b47-05a1d30bcca4 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
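The resource-provider record created above is what the scheduler draws on for the instance claims that follow. As a minimal sketch, assuming standard Placement semantics (usable capacity = (total - reserved) * allocation_ratio), the inventory this provider reports just below reduces to schedulable capacity like this; the dict literal copies the logged figures, and effective_capacity() is an illustrative helper, not Nova or Placement code:

# Sketch only: mirrors the provider inventory logged below and applies the
# usual Placement capacity rule. effective_capacity() is hypothetical.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'min_unit': 1, 'max_unit': 16,
                  'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530,
                  'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'min_unit': 1, 'max_unit': 176,
                  'step_size': 1, 'allocation_ratio': 1.0},
}

def effective_capacity(inv):
    # Placement treats (total - reserved) * allocation_ratio as the amount
    # that can be allocated to consumers of this resource class.
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

for rc, inv in inventory.items():
    print(rc, effective_capacity(inv))
# VCPU 192.0
# MEMORY_MB 196078.0
# DISK_GB 400.0

Each m1.nano claim logged further down (1 vCPU, 128 MB, 1 GB root disk) is debited as an allocation against these totals rather than by editing the inventory, which is presumably why the later "Claim successful" lines are accompanied by "Inventory has not changed" for the same provider.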
[ 493.030185] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-329e134d-1a4e-42ee-a4b4-eaa1f0bd14fe {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 493.038422] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25f94bbe-fe5a-45aa-a662-75141d8d762b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 493.067283] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e5f6ace-2071-4712-b308-d7cedd789b08 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 493.074480] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aefff290-c422-4e23-a2ad-90ae14f31f6c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 493.087591] env[60764]: DEBUG nova.compute.provider_tree [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 493.128543] env[60764]: DEBUG nova.scheduler.client.report [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Updated inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 493.128765] env[60764]: DEBUG nova.compute.provider_tree [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Updating resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 generation from 0 to 1 during operation: update_inventory {{(pid=60764) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 493.128907] env[60764]: DEBUG nova.compute.provider_tree [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 493.176671] env[60764]: DEBUG nova.compute.provider_tree [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Updating 
resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 generation from 1 to 2 during operation: update_traits {{(pid=60764) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 493.193870] env[60764]: DEBUG nova.compute.resource_tracker [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 493.194124] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 493.194317] env[60764]: DEBUG nova.service [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Creating RPC server for service compute {{(pid=60764) start /opt/stack/nova/nova/service.py:182}} [ 493.207830] env[60764]: DEBUG nova.service [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] Join ServiceGroup membership for this service compute {{(pid=60764) start /opt/stack/nova/nova/service.py:199}} [ 493.208024] env[60764]: DEBUG nova.servicegroup.drivers.db [None req-4b8f0f0c-706d-49a0-8621-50f57046fb2f None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60764) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 496.211015] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 496.221441] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Getting list of instances from cluster (obj){ [ 496.221441] env[60764]: value = "domain-c8" [ 496.221441] env[60764]: _type = "ClusterComputeResource" [ 496.221441] env[60764]: } {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 496.222596] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed5c65a-d5b0-439e-be2a-73c42289eb20 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 496.231855] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Got total of 0 instances {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 496.232085] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 496.232380] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Getting list of instances from cluster (obj){ [ 496.232380] env[60764]: value = "domain-c8" [ 496.232380] env[60764]: _type = "ClusterComputeResource" [ 496.232380] env[60764]: } {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 496.233230] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3bf68f4-3805-4f1f-9e64-53dc64543978 
{{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 496.240896] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Got total of 0 instances {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 502.527371] env[60764]: DEBUG dbcounter [-] [60764] Writing DB stats nova_cell0:SELECT=1 {{(pid=60764) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 502.528092] env[60764]: DEBUG dbcounter [-] [60764] Writing DB stats nova_cell1:SELECT=1 {{(pid=60764) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 534.507437] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquiring lock "116f360a-6080-46c7-8234-69fe54b9a147" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 534.507791] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Lock "116f360a-6080-46c7-8234-69fe54b9a147" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 534.535476] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 534.645626] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 534.645626] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 534.647113] env[60764]: INFO nova.compute.claims [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 534.773066] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cf8955e-e561-435f-a543-0c74c0ba1cd4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.784988] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50150945-303e-42ca-a1a4-092d258d8762 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.821915] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96d71d9d-a83c-4d83-8588-8f36546c796b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.830163] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d44b9c61-aa13-4a62-9c0f-99d9f0ff9691 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.843629] env[60764]: DEBUG nova.compute.provider_tree [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 534.852769] env[60764]: DEBUG nova.scheduler.client.report [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 534.875308] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 
tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 534.875308] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 534.923902] env[60764]: DEBUG nova.compute.utils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 534.928574] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 534.928574] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 534.942874] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 535.047649] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 535.558200] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 535.558457] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 535.559542] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 535.560045] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 535.560682] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 535.560880] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 535.562261] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 535.562261] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 535.562261] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 
tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 535.562261] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 535.562555] env[60764]: DEBUG nova.virt.hardware [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 535.563611] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51336923-651c-45e4-8923-557206240dcc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.575639] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cd07a07-c076-4686-b4fa-a8f1a448157f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.601123] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c92ad324-18af-4f0e-a0e1-79cc4934f17c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.764761] env[60764]: DEBUG nova.policy [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69076ffeee5d4d43b3c6632f01d6c703', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd75a81583a3b4cd58793514fba83cf70', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 536.145153] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquiring lock "e60a6397-30ad-48cb-ab52-7ae977615dc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 536.145381] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Lock "e60a6397-30ad-48cb-ab52-7ae977615dc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 536.173733] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde 
tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 536.270883] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 536.271159] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 536.273883] env[60764]: INFO nova.compute.claims [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 536.436984] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db9c550c-79b3-4161-a1f1-1f479c00d058 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.447470] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29cbcbe0-a941-453a-9ad9-7dc9e7d6420d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.485973] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c196cf84-4205-48ab-a686-c8df70054b98 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.494672] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d53589d-afe0-42cd-9a67-931cd422121a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.509966] env[60764]: DEBUG nova.compute.provider_tree [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 536.526238] env[60764]: DEBUG nova.scheduler.client.report [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 536.549675] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 536.550375] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 536.610598] env[60764]: DEBUG nova.compute.utils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 536.612238] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 536.616019] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 536.630767] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 536.708550] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 536.738089] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 536.738333] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 536.738481] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 536.738678] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 536.738990] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 536.738990] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 536.739263] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 536.739426] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 536.739585] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 536.739871] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 536.739977] env[60764]: DEBUG nova.virt.hardware [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 536.741041] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57e9317e-8b80-4624-aa4c-d064815aa3ae {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.749830] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4edb5659-bd21-4f94-8fe0-b7847157f264 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.809996] env[60764]: DEBUG nova.policy [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4b667b7bbee40f99e1f659a4f04145f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '03c71129d932454192e0f3c1d7ffecf1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 536.909328] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Successfully created port: 4ddcb2ae-f096-4657-86bb-31627f621630 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 537.761037] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquiring lock "2f530484-d828-4b65-a81e-c1c1a84ec903" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 537.761295] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Lock "2f530484-d828-4b65-a81e-c1c1a84ec903" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 537.775172] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 537.853691] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 537.853691] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 537.854305] env[60764]: INFO nova.compute.claims [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 537.886386] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Successfully created port: 0add5ef6-4484-411b-b5f7-9e1be9acf8a2 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 538.035548] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f957ab89-8d13-41f1-9533-9f4817bae74b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.048674] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e9e3147-6ff5-4fa3-b340-49d7de3a04d0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.098181] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1658b079-18f8-4e9f-bb1f-2b0b661ed8d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.114295] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9ccdc4e-ab64-40e4-909c-51d629b34c32 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.140884] env[60764]: DEBUG nova.compute.provider_tree [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 538.160679] env[60764]: DEBUG nova.scheduler.client.report [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 538.196220] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 538.197401] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 538.268266] env[60764]: DEBUG nova.compute.utils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 538.272046] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 538.272101] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 538.294177] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 538.396843] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "673185d0-9e2c-49dc-8323-f8b30a65b59d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 538.397806] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 538.400635] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 538.413218] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 538.438807] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 538.438807] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 538.438807] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 538.439435] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b 
tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 538.439435] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 538.439435] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 538.439562] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 538.439652] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 538.439833] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 538.440013] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 538.440440] env[60764]: DEBUG nova.virt.hardware [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 538.441209] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-338d6d93-a336-4139-aea7-b81dc307135e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.451498] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d63aeee4-9483-418a-b2e3-f55beec38454 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.517049] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 538.517049] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 538.518405] env[60764]: INFO nova.compute.claims [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 538.593891] env[60764]: DEBUG nova.policy [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1b5aee01174493e99ab079a023ab922', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a9ba07dcf124300ba7f9740e3da1a4c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 538.689825] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef339707-c578-44bb-b089-361d42186f36 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.698690] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f67b680-534d-4bed-b041-af5b72d16899 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.734218] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-340ed0ab-2391-417e-8f95-9db54fee9563 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.742727] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-147f849a-1c3a-4bf2-81f4-506c1214369d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.766293] env[60764]: DEBUG nova.compute.provider_tree [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 538.783685] env[60764]: DEBUG nova.scheduler.client.report [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 538.815725] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 538.816107] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 538.876347] env[60764]: DEBUG nova.compute.utils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 538.879018] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 538.879205] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 538.904024] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 539.005733] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 539.039187] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 539.039559] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 539.039667] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 539.040514] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 539.040761] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 539.040935] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 539.041134] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 539.041319] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 539.041568] env[60764]: DEBUG 
nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 539.041672] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 539.041950] env[60764]: DEBUG nova.virt.hardware [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 539.043062] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fd8c6c0-c0b2-47b8-92e8-ea03fd563cc8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 539.057629] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76d0be9c-af79-4461-a8a3-ff104aba6610 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 539.077717] env[60764]: DEBUG nova.policy [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef145bdd3bd445869b7a396af80cd032', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '596ed3da89b84f9d8fcf5d9e94002377', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 539.501331] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Successfully updated port: 4ddcb2ae-f096-4657-86bb-31627f621630 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 539.528255] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquiring lock "refresh_cache-116f360a-6080-46c7-8234-69fe54b9a147" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 539.528255] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquired lock "refresh_cache-116f360a-6080-46c7-8234-69fe54b9a147" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 539.528255] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 
tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 539.664059] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 540.255298] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 540.255617] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 540.274122] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 540.366237] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 540.366237] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 540.368044] env[60764]: INFO nova.compute.claims [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 540.382549] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Successfully created port: 6d2b1df4-c829-4a84-95b1-2d1cf877b993 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 540.500424] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Successfully created port: 8fd04950-d805-4008-8a6d-bceca7b98edc {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 540.600713] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd27e614-822c-416c-964f-7ca5e9114dc1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.613413] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37e8c3c5-4d4a-4eea-a124-16b6e1aa5113 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.657993] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7acfb9f2-a16a-4bee-9caa-bc2a5a18599a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.668487] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e376a172-d4d0-42c9-9138-e199661e3dd8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.683906] env[60764]: DEBUG nova.compute.provider_tree [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 540.696988] env[60764]: DEBUG 
nova.scheduler.client.report [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 540.726620] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 540.727176] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 540.775575] env[60764]: DEBUG nova.compute.utils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 540.776848] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 540.777027] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 540.792783] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 540.815129] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Updating instance_info_cache with network_info: [{"id": "4ddcb2ae-f096-4657-86bb-31627f621630", "address": "fa:16:3e:b6:c7:a7", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.254", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ddcb2ae-f0", "ovs_interfaceid": "4ddcb2ae-f096-4657-86bb-31627f621630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 540.838836] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Releasing lock "refresh_cache-116f360a-6080-46c7-8234-69fe54b9a147" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 540.839182] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Instance network_info: |[{"id": "4ddcb2ae-f096-4657-86bb-31627f621630", "address": "fa:16:3e:b6:c7:a7", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.254", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ddcb2ae-f0", "ovs_interfaceid": "4ddcb2ae-f096-4657-86bb-31627f621630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 540.839996] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b6:c7:a7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4ddcb2ae-f096-4657-86bb-31627f621630', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 540.857018] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.857018] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b7cf4e8c-5050-45e3-90ee-d67261fdd3c5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.870611] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Created folder: OpenStack in parent group-v4. [ 540.870793] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Creating folder: Project (d75a81583a3b4cd58793514fba83cf70). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.871059] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-21692cdd-d27a-4210-aba5-56255e604c1c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.875235] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 540.880892] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Created folder: Project (d75a81583a3b4cd58793514fba83cf70) in parent group-v449629. [ 540.880983] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Creating folder: Instances. Parent ref: group-v449630. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.881186] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-328b4c76-aec6-41e1-8c36-cbafa16ac912 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.894836] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Created folder: Instances in parent group-v449630. 
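Annotation: the "Creating folder: ... / Created folder: ..." entries above are nova.virt.vmwareapi.vm_util.create_folder (vm_util.py:1589 in this log) invoking Folder.CreateFolder through oslo.vmware's session layer. The snippet below is a minimal, illustrative sketch of that primitive only; the vCenter host, credentials and the datacenter lookup are placeholders, not values taken from this log.

```python
# Minimal sketch (not Nova's actual code) of the Folder.CreateFolder primitive
# behind the "Creating folder: ..." entries above. Host, credentials and the
# datacenter lookup are placeholders.
from oslo_vmware import api as vmware_api
from oslo_vmware import vim_util

session = vmware_api.VMwareAPISession(
    'vc.example.test', 'svc-user', 'secret',      # placeholder endpoint/creds
    api_retry_count=10, task_poll_interval=0.5)

# Illustration only: take the vmFolder of the first datacenter as the parent.
result = session.invoke_api(vim_util, 'get_objects', session.vim,
                            'Datacenter', 100, ['vmFolder'])
parent_folder = result.objects[0].propSet[0].val

# Synchronous call returning the new folder's managed object reference;
# vSphere raises DuplicateName if the folder already exists, which the driver
# treats as "folder already created".
child_folder = session.invoke_api(session.vim, 'CreateFolder',
                                  parent_folder, name='OpenStack')
```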
[ 540.895177] env[60764]: DEBUG oslo.service.loopingcall [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 540.895427] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 540.895932] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5aa9750c-c838-415b-84fc-da73100e638e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.925269] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 540.925269] env[60764]: value = "task-2204823" [ 540.925269] env[60764]: _type = "Task" [ 540.925269] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 540.927893] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 540.927893] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 540.928093] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 540.928138] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 540.928360] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 540.928568] env[60764]: DEBUG nova.virt.hardware [None 
req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 540.928732] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 540.928888] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 540.929071] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 540.929238] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 540.929404] env[60764]: DEBUG nova.virt.hardware [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 540.930839] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8deadeee-f9a4-4228-9e2b-db7c21a64df9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.942596] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f240fc1d-77ed-46b3-83f2-d722987596cf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 540.964899] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204823, 'name': CreateVM_Task} progress is 6%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 541.094750] env[60764]: DEBUG nova.policy [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '42e02cf5c1db4d16b3b480b19b033355', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69242923ca144230a05279c778bfeca1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 541.420761] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Successfully updated port: 0add5ef6-4484-411b-b5f7-9e1be9acf8a2 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 541.438228] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquiring lock "refresh_cache-e60a6397-30ad-48cb-ab52-7ae977615dc3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 541.438377] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquired lock "refresh_cache-e60a6397-30ad-48cb-ab52-7ae977615dc3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 541.438542] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 541.446602] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204823, 'name': CreateVM_Task, 'duration_secs': 0.46338} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 541.447261] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 541.483627] env[60764]: DEBUG oslo_vmware.service [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78af9a15-f548-407a-9234-eeb68cb2ac31 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 541.491926] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 541.497368] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 541.498550] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 541.499075] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-03833a49-ab41-4c15-81ab-a3343e8d65f7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 541.507263] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Waiting for the task: (returnval){ [ 541.507263] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524657c8-b0e3-9993-3504-4f5f9e0dd475" [ 541.507263] env[60764]: _type = "Task" [ 541.507263] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 541.514945] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524657c8-b0e3-9993-3504-4f5f9e0dd475, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 541.617089] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 541.759364] env[60764]: DEBUG nova.compute.manager [req-578e8d6d-0633-4aa4-99fa-1636ab4963a3 req-f586f917-2c7b-44b8-b04b-5735358b3152 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Received event network-vif-plugged-4ddcb2ae-f096-4657-86bb-31627f621630 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 541.759566] env[60764]: DEBUG oslo_concurrency.lockutils [req-578e8d6d-0633-4aa4-99fa-1636ab4963a3 req-f586f917-2c7b-44b8-b04b-5735358b3152 service nova] Acquiring lock "116f360a-6080-46c7-8234-69fe54b9a147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 541.759765] env[60764]: DEBUG oslo_concurrency.lockutils [req-578e8d6d-0633-4aa4-99fa-1636ab4963a3 req-f586f917-2c7b-44b8-b04b-5735358b3152 service nova] Lock "116f360a-6080-46c7-8234-69fe54b9a147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 541.759919] env[60764]: DEBUG oslo_concurrency.lockutils [req-578e8d6d-0633-4aa4-99fa-1636ab4963a3 req-f586f917-2c7b-44b8-b04b-5735358b3152 service nova] Lock "116f360a-6080-46c7-8234-69fe54b9a147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 541.763103] env[60764]: DEBUG nova.compute.manager [req-578e8d6d-0633-4aa4-99fa-1636ab4963a3 req-f586f917-2c7b-44b8-b04b-5735358b3152 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] No waiting events found dispatching network-vif-plugged-4ddcb2ae-f096-4657-86bb-31627f621630 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 541.763103] env[60764]: WARNING nova.compute.manager [req-578e8d6d-0633-4aa4-99fa-1636ab4963a3 req-f586f917-2c7b-44b8-b04b-5735358b3152 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Received unexpected event network-vif-plugged-4ddcb2ae-f096-4657-86bb-31627f621630 for instance with vm_state building and task_state spawning. 
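Annotation: the "Waiting for the task: (returnval){ value = "task-2204823" ... }", "progress is 6%" and "completed successfully" lines above come from oslo.vmware's task poller. The sketch below shows that pattern under stated assumptions: `session` is an authenticated oslo_vmware.api.VMwareAPISession, and `vm_folder`, `res_pool` and `config_spec` stand in for the objects the driver builds from the flavor, image metadata and VIF info logged earlier.

```python
# Hedged sketch of the CreateVM_Task + wait_for_task pattern seen above.
from oslo_vmware import exceptions as vexc


def create_vm(session, vm_folder, res_pool, config_spec):
    """Start CreateVM_Task and block until it finishes (or faults)."""
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                                  config=config_spec, pool=res_pool)
    try:
        # wait_for_task() re-reads the TaskInfo every task_poll_interval
        # seconds, logging "progress is N%", and returns it once the task
        # state is 'success'.
        task_info = session.wait_for_task(task_ref)
    except vexc.VimException:
        # vSphere reported a fault for the task (e.g. DuplicateName).
        raise
    return task_info.result   # managed object reference of the new VM
```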
[ 542.019359] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 542.019449] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 542.019693] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 542.019878] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 542.020349] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 542.020657] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4352faa9-048d-423e-9205-ef172afe1e58 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.039905] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 542.040097] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 542.040930] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c020006-4dd6-4630-b8f4-4d73d1050723 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.048872] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-86633063-f696-41be-bf71-b2f08faf8add {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.054473] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Waiting for the task: (returnval){ [ 542.054473] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52274d4e-7e16-d8be-741f-c384de8fab59" [ 542.054473] env[60764]: _type = "Task" [ 542.054473] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 542.062645] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52274d4e-7e16-d8be-741f-c384de8fab59, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 542.572543] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 542.572543] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Creating directory with path [datastore2] vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 542.573185] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0742bac1-0486-4807-b38e-45a98810aaf0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.596015] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Created directory with path [datastore2] vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 542.596918] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Fetch image to [datastore2] vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 542.596918] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 542.598221] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea7d410-81e0-48b2-a819-ed6883529e2f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.619443] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3e1ff88-0db3-4143-a369-8ab731d8873f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.636550] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aae69d3c-75d7-4c23-88ad-18d7ea317d9a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.676113] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b9a074e-bcdf-41ff-b6b4-149c56c52b71 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.684012] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9e4f421b-535a-4c65-aa41-2b5cca7cac43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.712459] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 542.725958] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Updating instance_info_cache with network_info: [{"id": "0add5ef6-4484-411b-b5f7-9e1be9acf8a2", "address": "fa:16:3e:bc:ce:6a", "network": {"id": "f8a6745b-cbbf-4c9f-8a95-14fa8637e238", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2129408972-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03c71129d932454192e0f3c1d7ffecf1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b56036cd-97ac-47f5-9089-7b38bfe99228", "external-id": "nsx-vlan-transportzone-301", "segmentation_id": 301, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0add5ef6-44", 
"ovs_interfaceid": "0add5ef6-4484-411b-b5f7-9e1be9acf8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 542.744563] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Releasing lock "refresh_cache-e60a6397-30ad-48cb-ab52-7ae977615dc3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 542.744655] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Instance network_info: |[{"id": "0add5ef6-4484-411b-b5f7-9e1be9acf8a2", "address": "fa:16:3e:bc:ce:6a", "network": {"id": "f8a6745b-cbbf-4c9f-8a95-14fa8637e238", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2129408972-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03c71129d932454192e0f3c1d7ffecf1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b56036cd-97ac-47f5-9089-7b38bfe99228", "external-id": "nsx-vlan-transportzone-301", "segmentation_id": 301, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0add5ef6-44", "ovs_interfaceid": "0add5ef6-4484-411b-b5f7-9e1be9acf8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 542.747225] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bc:ce:6a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b56036cd-97ac-47f5-9089-7b38bfe99228', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0add5ef6-4484-411b-b5f7-9e1be9acf8a2', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 542.759032] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Creating folder: Project (03c71129d932454192e0f3c1d7ffecf1). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 542.767023] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-50abd8f8-87ba-4e74-9b38-8f975ade6bcf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.779723] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Created folder: Project (03c71129d932454192e0f3c1d7ffecf1) in parent group-v449629. [ 542.779723] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Creating folder: Instances. Parent ref: group-v449633. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 542.779723] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e869059f-12c5-4304-ba96-483dda1fad83 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.789494] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Created folder: Instances in parent group-v449633. [ 542.789677] env[60764]: DEBUG oslo.service.loopingcall [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 542.792507] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 542.792507] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cf4e282b-b11e-4491-b6e0-7ef0ea9d91b5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 542.816922] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 542.816922] env[60764]: value = "task-2204826" [ 542.816922] env[60764]: _type = "Task" [ 542.816922] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 542.830516] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204826, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 542.833011] env[60764]: DEBUG oslo_vmware.rw_handles [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
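The write handle being opened here streams the Glance image bytes straight to the ESX host's /folder datastore endpoint over HTTPS. A hedged sketch of the same transfer using plain `requests` (the real path goes through oslo_vmware.rw_handles and a vCenter-issued service ticket; the URL and cookie below are placeholders):

import requests

def upload_vmdk(folder_url, auth_cookie, image_file):
    # image_file is an open binary file-like object with the vmdk data
    # (21318656 bytes for the tmp-sparse.vmdk above).
    resp = requests.put(
        folder_url,                      # .../folder/<path>?dcPath=...&dsName=...
        data=image_file,
        headers={'Cookie': auth_cookie},
        verify=False,                    # devstack lab; validate certs in production
    )
    resp.raise_for_status()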
{{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 542.911279] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "bea83327-9479-46b2-bd78-c81d72359e8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 542.911689] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "bea83327-9479-46b2-bd78-c81d72359e8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 542.916119] env[60764]: DEBUG oslo_vmware.rw_handles [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 542.916119] env[60764]: DEBUG oslo_vmware.rw_handles [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 542.930687] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Starting instance... 
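The "Acquiring lock ... / Lock ... acquired / released" pairs throughout this log are oslo.concurrency fair locks: one keyed on the instance UUID around _locked_do_build_and_run_instance, and one named "compute_resources" around the resource tracker's claim. A sketch of that pattern using lockutils from oslo.concurrency (the function bodies are placeholders):

from oslo_concurrency import lockutils

synchronized = lockutils.synchronized_with_prefix('nova-')

@synchronized('compute_resources')
def instance_claim(instance):
    # Resource-tracker work runs with the process-wide lock held; the log
    # records how long each caller waited for and held it.
    pass

def build_and_run_instance(instance_uuid):
    with lockutils.lock(instance_uuid):
        # _locked_do_build_and_run_instance body would run here.
        pass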
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 542.996184] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 542.996443] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 543.001517] env[60764]: INFO nova.compute.claims [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 543.101972] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Successfully created port: 0893ced5-400b-44f6-bec0-902a455f1c36 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 543.219851] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25a1696e-dc79-4105-9036-e8de98971b36 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.229588] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7832bdf6-8abd-4ba3-8f88-10049214e81b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.275047] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bc1ca3b-fd42-4909-9077-3662cf4b10c1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.283798] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6c37180-4fae-44a4-af4f-506b1617bc04 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.300151] env[60764]: DEBUG nova.compute.provider_tree [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 543.317391] env[60764]: DEBUG nova.scheduler.client.report [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 543.334604] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204826, 'name': CreateVM_Task, 'duration_secs': 0.334613} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 543.334604] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 543.334832] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 543.334871] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 543.335257] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 543.335511] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7fb2c8c8-f9dd-4671-ab99-71a9a21e93b1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.341021] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Waiting for the task: (returnval){ [ 543.341021] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52309943-91c8-2fd7-e5be-0bfc6fbedc81" [ 543.341021] env[60764]: _type = "Task" [ 543.341021] env[60764]: } to complete. 
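Every VMware operation in this log follows the same pattern: invoke a *_Task method, then block in oslo.vmware's poller until it finishes (the "Task: {'id': task-2204826, 'name': CreateVM_Task} progress is 0%" lines are that poller). In driver code this is a single blocking call, sketched here assuming `session` is the established VMwareAPISession and `task_ref` is the moref returned by the invocation:

def wait_for_vmware_task(session, task_ref):
    # Re-polls at a fixed interval, raises if the task ends in error, and on
    # success returns the task info ('duration_secs' etc. shown in the log).
    return session.wait_for_task(task_ref)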
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 543.348387] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 543.348966] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 543.355605] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52309943-91c8-2fd7-e5be-0bfc6fbedc81, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 543.404058] env[60764]: DEBUG nova.compute.utils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 543.407142] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 543.407142] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 543.421567] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 543.503278] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Start spawning the instance on the hypervisor. 
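The nova.virt.hardware lines that follow pick a CPU topology for the m1.nano flavor: 1 vCPU with no flavor or image limits can only be factored as 1 socket x 1 core x 1 thread, which is why exactly one possible topology is reported. A simplified stand-in for that search (not nova's actual implementation):

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Enumerate every (sockets, cores, threads) factorization within the limits.
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))   # [(1, 1, 1)] -- "Got 1 possible topologies"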
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 543.528248] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 543.530418] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 543.530418] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 543.530418] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 543.530418] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 543.530418] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 543.530660] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 543.530660] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 543.530660] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 543.530660] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 543.530660] env[60764]: DEBUG nova.virt.hardware [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 543.530989] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7f5e3dd-e989-4c11-b3e3-eef004ff2b75 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.540938] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d64fb390-4774-4683-90ef-bdd08451ea59 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.709806] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Successfully updated port: 6d2b1df4-c829-4a84-95b1-2d1cf877b993 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 543.722612] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquiring lock "refresh_cache-2f530484-d828-4b65-a81e-c1c1a84ec903" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 543.722894] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquired lock "refresh_cache-2f530484-d828-4b65-a81e-c1c1a84ec903" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 543.722969] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 543.765246] env[60764]: DEBUG nova.policy [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1f32cefba46c4c1699d69409b4eb6147', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 
'project_id': '2a4e6f7c3621435f897f8009f1693251', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 543.817413] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Successfully updated port: 8fd04950-d805-4008-8a6d-bceca7b98edc {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 543.839668] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "refresh_cache-673185d0-9e2c-49dc-8323-f8b30a65b59d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 543.839800] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquired lock "refresh_cache-673185d0-9e2c-49dc-8323-f8b30a65b59d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 543.840649] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 543.862151] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 543.862151] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 543.862315] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 543.875682] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Instance cache missing network info. 
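The network_info blobs logged above and below all share one shape: a list of VIF dicts handed back by Neutron. When reading these entries only a handful of fields usually matter; a small helper to pull them out, with key names taken directly from the blobs in this log:

def summarize_vif(vif):
    # vif is one element of the network_info list logged by
    # update_instance_cache_with_nw_info.
    fixed_ips = [ip['address']
                 for subnet in vif['network']['subnets']
                 for ip in subnet['ips']]
    return {
        'port_id': vif['id'],
        'mac': vif['address'],
        'fixed_ips': fixed_ips,
        'segmentation_id': vif['details'].get('segmentation_id'),
        'devname': vif.get('devname'),
        'active': vif.get('active'),
    }

# e.g. for the FloatingIPsAssociationTestJSON instance above:
# {'port_id': '0add5ef6-...', 'mac': 'fa:16:3e:bc:ce:6a', 'fixed_ips': ['192.168.128.12'],
#  'segmentation_id': 301, 'devname': 'tap0add5ef6-44', 'active': True}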
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 544.015382] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 544.574444] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Updating instance_info_cache with network_info: [{"id": "6d2b1df4-c829-4a84-95b1-2d1cf877b993", "address": "fa:16:3e:9d:51:d9", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6d2b1df4-c8", "ovs_interfaceid": "6d2b1df4-c829-4a84-95b1-2d1cf877b993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 544.588348] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Releasing lock "refresh_cache-2f530484-d828-4b65-a81e-c1c1a84ec903" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 544.588614] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Instance network_info: |[{"id": "6d2b1df4-c829-4a84-95b1-2d1cf877b993", "address": "fa:16:3e:9d:51:d9", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tap6d2b1df4-c8", "ovs_interfaceid": "6d2b1df4-c829-4a84-95b1-2d1cf877b993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 544.589315] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9d:51:d9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6d2b1df4-c829-4a84-95b1-2d1cf877b993', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 544.601646] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Creating folder: Project (1a9ba07dcf124300ba7f9740e3da1a4c). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 544.603118] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3d4755d0-413b-4c3a-b6c6-a2d4414eb864 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.615057] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Created folder: Project (1a9ba07dcf124300ba7f9740e3da1a4c) in parent group-v449629. [ 544.615321] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Creating folder: Instances. Parent ref: group-v449636. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 544.615568] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1424586-d0de-4238-9c40-fd2be4d5831f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.625495] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Created folder: Instances in parent group-v449636. [ 544.625804] env[60764]: DEBUG oslo.service.loopingcall [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 544.625919] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 544.626341] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f5ebab81-3870-4d51-bbd1-c72f3ae0e1cc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.641670] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Updating instance_info_cache with network_info: [{"id": "8fd04950-d805-4008-8a6d-bceca7b98edc", "address": "fa:16:3e:70:3d:21", "network": {"id": "7bfa728f-5596-4d9c-80cc-07e28bf59b24", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-372150778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "596ed3da89b84f9d8fcf5d9e94002377", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fd04950-d8", "ovs_interfaceid": "8fd04950-d805-4008-8a6d-bceca7b98edc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 544.649116] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 544.649116] env[60764]: value = "task-2204829" [ 544.649116] env[60764]: _type = "Task" [ 544.649116] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 544.656169] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204829, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 544.656998] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Releasing lock "refresh_cache-673185d0-9e2c-49dc-8323-f8b30a65b59d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 544.657909] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance network_info: |[{"id": "8fd04950-d805-4008-8a6d-bceca7b98edc", "address": "fa:16:3e:70:3d:21", "network": {"id": "7bfa728f-5596-4d9c-80cc-07e28bf59b24", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-372150778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "596ed3da89b84f9d8fcf5d9e94002377", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fd04950-d8", "ovs_interfaceid": "8fd04950-d805-4008-8a6d-bceca7b98edc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 544.658648] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:70:3d:21', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8fd04950-d805-4008-8a6d-bceca7b98edc', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 544.666626] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Creating folder: Project (596ed3da89b84f9d8fcf5d9e94002377). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 544.667422] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aa3c5b43-f4e1-4337-bc33-d7979c9ee05b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.678297] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Created folder: Project (596ed3da89b84f9d8fcf5d9e94002377) in parent group-v449629. [ 544.678475] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Creating folder: Instances. Parent ref: group-v449638. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 544.678942] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4b9f46b7-c131-4e2f-8c29-3270ba31d0a2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.689965] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Created folder: Instances in parent group-v449638. [ 544.689965] env[60764]: DEBUG oslo.service.loopingcall [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 544.689965] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 544.689965] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aaf00910-4b78-413f-a756-8c1141d43c8b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.712147] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 544.712147] env[60764]: value = "task-2204832" [ 544.712147] env[60764]: _type = "Task" [ 544.712147] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 544.720487] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204832, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 545.072384] env[60764]: DEBUG nova.compute.manager [req-661e381b-d5bb-41a7-bffd-8913c2a27cbc req-823bc69c-4ef7-466b-b7d2-8f6b3f8fce2c service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Received event network-vif-plugged-6d2b1df4-c829-4a84-95b1-2d1cf877b993 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 545.072655] env[60764]: DEBUG oslo_concurrency.lockutils [req-661e381b-d5bb-41a7-bffd-8913c2a27cbc req-823bc69c-4ef7-466b-b7d2-8f6b3f8fce2c service nova] Acquiring lock "2f530484-d828-4b65-a81e-c1c1a84ec903-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 545.072849] env[60764]: DEBUG oslo_concurrency.lockutils [req-661e381b-d5bb-41a7-bffd-8913c2a27cbc req-823bc69c-4ef7-466b-b7d2-8f6b3f8fce2c service nova] Lock "2f530484-d828-4b65-a81e-c1c1a84ec903-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 545.073026] env[60764]: DEBUG oslo_concurrency.lockutils [req-661e381b-d5bb-41a7-bffd-8913c2a27cbc req-823bc69c-4ef7-466b-b7d2-8f6b3f8fce2c service nova] Lock "2f530484-d828-4b65-a81e-c1c1a84ec903-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 545.073803] env[60764]: DEBUG nova.compute.manager [req-661e381b-d5bb-41a7-bffd-8913c2a27cbc req-823bc69c-4ef7-466b-b7d2-8f6b3f8fce2c service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] No waiting events found dispatching network-vif-plugged-6d2b1df4-c829-4a84-95b1-2d1cf877b993 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 545.073910] env[60764]: WARNING nova.compute.manager [req-661e381b-d5bb-41a7-bffd-8913c2a27cbc req-823bc69c-4ef7-466b-b7d2-8f6b3f8fce2c service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Received unexpected event network-vif-plugged-6d2b1df4-c829-4a84-95b1-2d1cf877b993 for instance with vm_state building and task_state spawning. [ 545.162514] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204829, 'name': CreateVM_Task, 'duration_secs': 0.368625} completed successfully. 
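The req-661e381b... records above are Neutron notifying Nova that a port went active, via the os-server-external-events API; the WARNING appears because the event arrived before the spawn registered a waiter for it ("No waiting events found dispatching"), which is harmless while the instance is still building. A hedged sketch of the notification Neutron sends (endpoint and token are placeholders; the field names follow the external-events API):

import requests

def notify_vif_plugged(nova_endpoint, token, server_uuid, port_id):
    body = {'events': [{
        'name': 'network-vif-plugged',
        'server_uuid': server_uuid,   # e.g. 2f530484-d828-4b65-a81e-c1c1a84ec903
        'tag': port_id,               # e.g. 6d2b1df4-c829-4a84-95b1-2d1cf877b993
        'status': 'completed',
    }]}
    resp = requests.post(f'{nova_endpoint}/os-server-external-events',
                         json=body, headers={'X-Auth-Token': token})
    resp.raise_for_status()
    return resp.json()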
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 545.162514] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 545.162699] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.163080] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 545.163277] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 545.166283] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3b4fc597-7026-460a-8a7a-c3f6efad0339 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 545.169084] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Waiting for the task: (returnval){ [ 545.169084] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521d351d-c4bc-c92b-bd81-fea656d7198e" [ 545.169084] env[60764]: _type = "Task" [ 545.169084] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 545.176756] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521d351d-c4bc-c92b-bd81-fea656d7198e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 545.224231] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204832, 'name': CreateVM_Task, 'duration_secs': 0.338846} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 545.224497] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 545.225076] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.272615] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Successfully created port: 7eb10c62-134f-4f3a-97de-e994655830d6 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 545.359371] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Successfully updated port: 0893ced5-400b-44f6-bec0-902a455f1c36 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 545.371749] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "refresh_cache-2696525a-3366-45a1-b413-8e4e0bd9d6c6" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.371891] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquired lock "refresh_cache-2696525a-3366-45a1-b413-8e4e0bd9d6c6" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 545.372064] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 545.559140] env[60764]: DEBUG nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Received event network-changed-4ddcb2ae-f096-4657-86bb-31627f621630 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 545.559391] env[60764]: DEBUG nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Refreshing instance network info cache due to event network-changed-4ddcb2ae-f096-4657-86bb-31627f621630. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 545.559542] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Acquiring lock "refresh_cache-116f360a-6080-46c7-8234-69fe54b9a147" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.559674] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Acquired lock "refresh_cache-116f360a-6080-46c7-8234-69fe54b9a147" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 545.559884] env[60764]: DEBUG nova.network.neutron [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Refreshing network info cache for port 4ddcb2ae-f096-4657-86bb-31627f621630 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 545.596980] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 545.685106] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 545.685540] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 545.685642] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.685801] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 545.686161] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 545.686500] 
env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c33d31a-3bf5-433d-8eb6-0e8cd66ddbaf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 545.692986] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Waiting for the task: (returnval){ [ 545.692986] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5246b80f-5736-20df-c5b1-ef0635ab3de5" [ 545.692986] env[60764]: _type = "Task" [ 545.692986] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 545.703160] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5246b80f-5736-20df-c5b1-ef0635ab3de5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 546.025219] env[60764]: DEBUG nova.network.neutron [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Updated VIF entry in instance network info cache for port 4ddcb2ae-f096-4657-86bb-31627f621630. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 546.025647] env[60764]: DEBUG nova.network.neutron [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Updating instance_info_cache with network_info: [{"id": "4ddcb2ae-f096-4657-86bb-31627f621630", "address": "fa:16:3e:b6:c7:a7", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.254", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ddcb2ae-f0", "ovs_interfaceid": "4ddcb2ae-f096-4657-86bb-31627f621630", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 546.038553] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Releasing lock "refresh_cache-116f360a-6080-46c7-8234-69fe54b9a147" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 546.040475] env[60764]: DEBUG nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 
req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Received event network-vif-plugged-0add5ef6-4484-411b-b5f7-9e1be9acf8a2 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 546.040475] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Acquiring lock "e60a6397-30ad-48cb-ab52-7ae977615dc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 546.040475] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Lock "e60a6397-30ad-48cb-ab52-7ae977615dc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 546.040475] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Lock "e60a6397-30ad-48cb-ab52-7ae977615dc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 546.040665] env[60764]: DEBUG nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] No waiting events found dispatching network-vif-plugged-0add5ef6-4484-411b-b5f7-9e1be9acf8a2 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 546.040665] env[60764]: WARNING nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Received unexpected event network-vif-plugged-0add5ef6-4484-411b-b5f7-9e1be9acf8a2 for instance with vm_state building and task_state spawning. [ 546.040665] env[60764]: DEBUG nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Received event network-changed-0add5ef6-4484-411b-b5f7-9e1be9acf8a2 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 546.040665] env[60764]: DEBUG nova.compute.manager [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Refreshing instance network info cache due to event network-changed-0add5ef6-4484-411b-b5f7-9e1be9acf8a2. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 546.040665] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Acquiring lock "refresh_cache-e60a6397-30ad-48cb-ab52-7ae977615dc3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 546.040812] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Acquired lock "refresh_cache-e60a6397-30ad-48cb-ab52-7ae977615dc3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 546.040927] env[60764]: DEBUG nova.network.neutron [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Refreshing network info cache for port 0add5ef6-4484-411b-b5f7-9e1be9acf8a2 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 546.207042] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 546.207349] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 546.207562] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 546.588253] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Updating instance_info_cache with network_info: [{"id": "0893ced5-400b-44f6-bec0-902a455f1c36", "address": "fa:16:3e:73:28:35", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tap0893ced5-40", "ovs_interfaceid": "0893ced5-400b-44f6-bec0-902a455f1c36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 546.605896] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Releasing lock "refresh_cache-2696525a-3366-45a1-b413-8e4e0bd9d6c6" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 546.606210] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance network_info: |[{"id": "0893ced5-400b-44f6-bec0-902a455f1c36", "address": "fa:16:3e:73:28:35", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0893ced5-40", "ovs_interfaceid": "0893ced5-400b-44f6-bec0-902a455f1c36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 546.606675] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:28:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0893ced5-400b-44f6-bec0-902a455f1c36', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 546.617454] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Creating folder: Project (69242923ca144230a05279c778bfeca1). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 546.618129] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2fb59452-ef4d-417b-8d6c-9ac655ee1f5e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 546.629644] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Created folder: Project (69242923ca144230a05279c778bfeca1) in parent group-v449629. [ 546.629935] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Creating folder: Instances. Parent ref: group-v449642. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 546.631180] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b231e237-3f27-4cf8-9454-e0a087b27e30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 546.639113] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Created folder: Instances in parent group-v449642. [ 546.639381] env[60764]: DEBUG oslo.service.loopingcall [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 546.639574] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 546.639877] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-82bc1d1c-8bb0-48b0-9315-2e4f22645947 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 546.664149] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 546.664149] env[60764]: value = "task-2204835" [ 546.664149] env[60764]: _type = "Task" [ 546.664149] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 546.675662] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204835, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 546.688145] env[60764]: DEBUG nova.network.neutron [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Updated VIF entry in instance network info cache for port 0add5ef6-4484-411b-b5f7-9e1be9acf8a2. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 546.688545] env[60764]: DEBUG nova.network.neutron [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Updating instance_info_cache with network_info: [{"id": "0add5ef6-4484-411b-b5f7-9e1be9acf8a2", "address": "fa:16:3e:bc:ce:6a", "network": {"id": "f8a6745b-cbbf-4c9f-8a95-14fa8637e238", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2129408972-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03c71129d932454192e0f3c1d7ffecf1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b56036cd-97ac-47f5-9089-7b38bfe99228", "external-id": "nsx-vlan-transportzone-301", "segmentation_id": 301, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0add5ef6-44", "ovs_interfaceid": "0add5ef6-4484-411b-b5f7-9e1be9acf8a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 546.702734] env[60764]: DEBUG oslo_concurrency.lockutils [req-576729d6-ae48-47d4-9050-55a1c73b6cf3 req-e4fb7962-9ce1-41ef-b5c1-290f8e21c7d0 service nova] Releasing lock "refresh_cache-e60a6397-30ad-48cb-ab52-7ae977615dc3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 547.182283] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204835, 'name': CreateVM_Task, 'duration_secs': 0.343752} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 547.182557] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 547.183351] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 547.183468] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 547.183809] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 547.184096] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5e06b444-5a74-4e5e-8f1f-93dfc92f615e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 547.190536] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Waiting for the task: (returnval){ [ 547.190536] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52a4dd34-b8d2-6ee6-5c42-689e4811fbea" [ 547.190536] env[60764]: _type = "Task" [ 547.190536] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 547.204290] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52a4dd34-b8d2-6ee6-5c42-689e4811fbea, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 547.704080] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 547.704856] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 547.706380] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 548.240380] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Successfully updated port: 7eb10c62-134f-4f3a-97de-e994655830d6 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 548.261324] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "refresh_cache-bea83327-9479-46b2-bd78-c81d72359e8a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 548.261628] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "refresh_cache-bea83327-9479-46b2-bd78-c81d72359e8a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 548.261896] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 548.361779] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 548.863582] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 548.865031] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 548.881960] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 548.949696] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 548.949696] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 548.950326] env[60764]: INFO nova.compute.claims [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 549.095429] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Updating instance_info_cache with network_info: [{"id": "7eb10c62-134f-4f3a-97de-e994655830d6", "address": "fa:16:3e:51:0b:18", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.131", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7eb10c62-13", "ovs_interfaceid": "7eb10c62-134f-4f3a-97de-e994655830d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 549.120248] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "refresh_cache-bea83327-9479-46b2-bd78-c81d72359e8a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 549.120388] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance network_info: |[{"id": "7eb10c62-134f-4f3a-97de-e994655830d6", "address": "fa:16:3e:51:0b:18", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.131", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7eb10c62-13", "ovs_interfaceid": "7eb10c62-134f-4f3a-97de-e994655830d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 549.122540] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:51:0b:18', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7eb10c62-134f-4f3a-97de-e994655830d6', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 549.132794] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating folder: Project (2a4e6f7c3621435f897f8009f1693251). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 549.138757] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7ecb88b9-0d60-482c-b227-0a59d4b694c8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.149051] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Created folder: Project (2a4e6f7c3621435f897f8009f1693251) in parent group-v449629. [ 549.149901] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating folder: Instances. Parent ref: group-v449645. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 549.149901] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fbb42d9b-ff33-4448-9d70-cefd0737de48 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.153689] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd2f8266-e01e-4749-996d-fac88e0db39a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.161417] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdce410b-c01e-4b1c-a802-683ba1d72078 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.165942] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Created folder: Instances in parent group-v449645. [ 549.166231] env[60764]: DEBUG oslo.service.loopingcall [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 549.167120] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 549.167365] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c7de8237-ed9f-4125-b616-2671f5df73e7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.216472] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cda3d46b-add6-4fd6-a5d6-fdb4c7afa912 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.220895] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 549.220895] env[60764]: value = "task-2204838" [ 549.220895] env[60764]: _type = "Task" [ 549.220895] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 549.227421] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0bf6292-3f08-4f8b-9571-b57107717068 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.242972] env[60764]: DEBUG nova.compute.provider_tree [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 549.247481] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204838, 'name': CreateVM_Task} progress is 6%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 549.254384] env[60764]: DEBUG nova.scheduler.client.report [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 549.275757] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 549.276800] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 549.342124] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.342124] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.342212] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 549.342392] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 549.358464] env[60764]: DEBUG nova.compute.utils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 549.363163] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Not allocating networking since 'none' was specified. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 549.374646] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.374646] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.374646] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.374646] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.374646] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.375117] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.375117] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 549.375117] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 549.375677] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.375995] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.376272] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.376344] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.376939] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.376939] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.377239] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 549.377738] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 549.391826] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 549.399951] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 549.399951] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 549.399951] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 549.399951] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 549.401102] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea40b3bb-ff38-4cf5-af1b-cf1ba3c40183 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.412035] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d83c15f-a827-4b2f-95e9-20c705eabe96 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.428416] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9335fdf6-c4a2-4e29-8592-26bcaa3c39e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.436024] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aff15f4-7620-46ea-a9d0-0af66d1270aa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.471447] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181275MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 549.471636] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 549.471889] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 549.537220] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 549.577337] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 549.577596] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 549.578156] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 549.578156] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 549.578156] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 549.578329] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 
tempest-ServersAdmin275Test-1188688059-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 549.578488] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 549.578625] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 549.578786] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 549.578943] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 549.579126] env[60764]: DEBUG nova.virt.hardware [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 549.580128] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 116f360a-6080-46c7-8234-69fe54b9a147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.580276] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e60a6397-30ad-48cb-ab52-7ae977615dc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.580407] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2f530484-d828-4b65-a81e-c1c1a84ec903 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.580530] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 673185d0-9e2c-49dc-8323-f8b30a65b59d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.580650] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.581046] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bea83327-9479-46b2-bd78-c81d72359e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.581046] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 549.581163] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 549.581226] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 549.584128] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbdf841a-cefe-46bf-8de9-e4063900d608 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.595856] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b60ba14-9bcf-431a-9ed9-7816e93a46fd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.606830] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance VIF info [] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 549.612563] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Creating folder: Project (b544e025cbb8482aa85f66ad82fff59b). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 549.615631] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b84cb6d1-ef42-4f3f-b05e-b8297d2e620f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.624698] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Created folder: Project (b544e025cbb8482aa85f66ad82fff59b) in parent group-v449629. [ 549.624917] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Creating folder: Instances. Parent ref: group-v449648. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 549.625143] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c7aec1fd-440a-445f-b31e-f2f4df696880 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.633641] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Created folder: Instances in parent group-v449648. [ 549.633893] env[60764]: DEBUG oslo.service.loopingcall [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 549.634168] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 549.634336] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-244e3c18-3511-429c-a166-e2e8c8207d30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.654130] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 549.654130] env[60764]: value = "task-2204841" [ 549.654130] env[60764]: _type = "Task" [ 549.654130] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 549.661324] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204841, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 549.737737] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204838, 'name': CreateVM_Task} progress is 99%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 549.750056] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af4e5056-d90c-4ca7-8f96-09f03c842e7e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.755492] env[60764]: DEBUG nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Received event network-vif-plugged-8fd04950-d805-4008-8a6d-bceca7b98edc {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 549.755681] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Acquiring lock "673185d0-9e2c-49dc-8323-f8b30a65b59d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 549.755883] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 549.756060] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 549.756226] env[60764]: DEBUG nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] No waiting events found dispatching network-vif-plugged-8fd04950-d805-4008-8a6d-bceca7b98edc {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 549.756380] env[60764]: WARNING nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Received unexpected event network-vif-plugged-8fd04950-d805-4008-8a6d-bceca7b98edc for instance with vm_state building and task_state spawning. [ 549.756531] env[60764]: DEBUG nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Received event network-changed-6d2b1df4-c829-4a84-95b1-2d1cf877b993 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 549.756680] env[60764]: DEBUG nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Refreshing instance network info cache due to event network-changed-6d2b1df4-c829-4a84-95b1-2d1cf877b993. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 549.756925] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Acquiring lock "refresh_cache-2f530484-d828-4b65-a81e-c1c1a84ec903" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 549.756975] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Acquired lock "refresh_cache-2f530484-d828-4b65-a81e-c1c1a84ec903" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 549.757136] env[60764]: DEBUG nova.network.neutron [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Refreshing network info cache for port 6d2b1df4-c829-4a84-95b1-2d1cf877b993 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 549.763167] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e5abd3e-a7f6-4a81-b0d5-1c5ee9846b39 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.797225] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-127c2b75-b6be-4fb4-a730-759316113412 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.809156] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cca3e3e-c8b6-4277-a5a1-ff2e39fd46d1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 549.824218] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 549.843557] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 549.869745] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 549.869745] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.398s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 550.165179] 
env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204841, 'name': CreateVM_Task, 'duration_secs': 0.432605} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 550.165430] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 550.165959] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 550.166150] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 550.166470] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 550.166729] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bfc8c576-85a1-40b3-8a46-230a7ff8021c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 550.172030] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for the task: (returnval){ [ 550.172030] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52aa799a-27c4-756e-701d-76708cc20465" [ 550.172030] env[60764]: _type = "Task" [ 550.172030] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 550.190640] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52aa799a-27c4-756e-701d-76708cc20465, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 550.237673] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204838, 'name': CreateVM_Task, 'duration_secs': 0.883618} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 550.237952] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 550.238774] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 550.386273] env[60764]: DEBUG nova.compute.manager [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Received event network-vif-plugged-0893ced5-400b-44f6-bec0-902a455f1c36 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 550.388213] env[60764]: DEBUG oslo_concurrency.lockutils [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] Acquiring lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 550.388497] env[60764]: DEBUG oslo_concurrency.lockutils [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 550.388621] env[60764]: DEBUG oslo_concurrency.lockutils [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 550.388768] env[60764]: DEBUG nova.compute.manager [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] No waiting events found dispatching network-vif-plugged-0893ced5-400b-44f6-bec0-902a455f1c36 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 550.390031] env[60764]: WARNING nova.compute.manager [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Received unexpected event network-vif-plugged-0893ced5-400b-44f6-bec0-902a455f1c36 for instance with vm_state building and task_state spawning. 
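Note on the external-event entries above: nova-compute receives "network-vif-plugged" / "network-changed" notifications from Neutron while the instance is still in vm_state building, takes the per-instance "<uuid>-events" lock, and tries to pop a registered waiter for that event name; when nothing is waiting it logs "No waiting events found dispatching ..." followed by the "Received unexpected event ..." warning. The snippet below is a minimal, self-contained sketch of that pop-or-warn pattern only; the class and function names are illustrative and are not Nova's actual InstanceEvents implementation referenced in the log.

```python
import threading
from typing import Dict, Optional


class InstanceEventRegistry:
    """Illustrative stand-in for the per-instance event table the log refers to
    (nova.compute.manager.InstanceEvents.pop_instance_event). A waiter registers
    an event name such as "network-vif-plugged-<port-id>" before blocking on it;
    an incoming external event pops the matching waiter if one exists."""

    def __init__(self) -> None:
        self._lock = threading.Lock()  # plays the role of the "<uuid>-events" lock
        self._waiters: Dict[str, Dict[str, threading.Event]] = {}

    def prepare(self, instance_uuid: str, event_name: str) -> threading.Event:
        """Register interest in an event before it can arrive."""
        with self._lock:
            ev = threading.Event()
            self._waiters.setdefault(instance_uuid, {})[event_name] = ev
            return ev

    def pop(self, instance_uuid: str, event_name: str) -> Optional[threading.Event]:
        """Pop the waiter for an incoming external event, if any."""
        with self._lock:  # "Acquiring lock ... by ._pop_event" in the log
            return self._waiters.get(instance_uuid, {}).pop(event_name, None)


def dispatch_external_event(registry: InstanceEventRegistry,
                            instance_uuid: str, event_name: str) -> None:
    waiter = registry.pop(instance_uuid, event_name)
    if waiter is None:
        # Corresponds to "No waiting events found dispatching ..." plus the
        # "Received unexpected event ..." warning for a building instance.
        print(f"WARNING: unexpected event {event_name} for instance {instance_uuid}")
    else:
        waiter.set()  # unblock whoever was waiting for the plug/change event


if __name__ == "__main__":
    reg = InstanceEventRegistry()
    # Nothing registered for this port's plug event yet, so it is "unexpected".
    dispatch_external_event(reg, "673185d0-9e2c-49dc-8323-f8b30a65b59d",
                            "network-vif-plugged-8fd04950-d805-4008-8a6d-bceca7b98edc")
```

In the log the warning is harmless: the spawn path has simply not yet registered a waiter for the port, so the event is recorded and the build continues.
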
[ 550.390031] env[60764]: DEBUG nova.compute.manager [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Received event network-changed-0893ced5-400b-44f6-bec0-902a455f1c36 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 550.390031] env[60764]: DEBUG nova.compute.manager [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Refreshing instance network info cache due to event network-changed-0893ced5-400b-44f6-bec0-902a455f1c36. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 550.390031] env[60764]: DEBUG oslo_concurrency.lockutils [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] Acquiring lock "refresh_cache-2696525a-3366-45a1-b413-8e4e0bd9d6c6" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 550.390031] env[60764]: DEBUG oslo_concurrency.lockutils [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] Acquired lock "refresh_cache-2696525a-3366-45a1-b413-8e4e0bd9d6c6" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 550.390564] env[60764]: DEBUG nova.network.neutron [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Refreshing network info cache for port 0893ced5-400b-44f6-bec0-902a455f1c36 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 550.684468] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 550.684759] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 550.685089] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 550.685203] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 550.685505] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 
tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 550.686128] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7ae5596c-67ce-4381-b1d4-0e00724ebea9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 550.690918] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 550.690918] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52989658-618e-b55a-4643-003bf68adeea" [ 550.690918] env[60764]: _type = "Task" [ 550.690918] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 550.699833] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52989658-618e-b55a-4643-003bf68adeea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 550.929715] env[60764]: DEBUG nova.network.neutron [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Updated VIF entry in instance network info cache for port 6d2b1df4-c829-4a84-95b1-2d1cf877b993. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 550.929715] env[60764]: DEBUG nova.network.neutron [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Updating instance_info_cache with network_info: [{"id": "6d2b1df4-c829-4a84-95b1-2d1cf877b993", "address": "fa:16:3e:9d:51:d9", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6d2b1df4-c8", "ovs_interfaceid": "6d2b1df4-c829-4a84-95b1-2d1cf877b993", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 550.946633] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Releasing lock "refresh_cache-2f530484-d828-4b65-a81e-c1c1a84ec903" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 550.946956] env[60764]: DEBUG nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Received event network-changed-8fd04950-d805-4008-8a6d-bceca7b98edc {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 550.947167] env[60764]: DEBUG nova.compute.manager [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Refreshing instance network info cache due to event network-changed-8fd04950-d805-4008-8a6d-bceca7b98edc. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 550.947351] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Acquiring lock "refresh_cache-673185d0-9e2c-49dc-8323-f8b30a65b59d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 550.947484] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Acquired lock "refresh_cache-673185d0-9e2c-49dc-8323-f8b30a65b59d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 550.947640] env[60764]: DEBUG nova.network.neutron [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Refreshing network info cache for port 8fd04950-d805-4008-8a6d-bceca7b98edc {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 551.205573] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 551.205843] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 551.206066] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 551.689730] env[60764]: DEBUG nova.network.neutron [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Updated VIF entry in instance network info cache for port 0893ced5-400b-44f6-bec0-902a455f1c36. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 551.692057] env[60764]: DEBUG nova.network.neutron [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Updating instance_info_cache with network_info: [{"id": "0893ced5-400b-44f6-bec0-902a455f1c36", "address": "fa:16:3e:73:28:35", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0893ced5-40", "ovs_interfaceid": "0893ced5-400b-44f6-bec0-902a455f1c36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 551.703617] env[60764]: DEBUG oslo_concurrency.lockutils [req-9bafa14c-4d31-42a1-99b5-993b2e2bac91 req-1127a9e2-9c16-4788-b4e5-afe6c453f591 service nova] Releasing lock "refresh_cache-2696525a-3366-45a1-b413-8e4e0bd9d6c6" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 551.846976] env[60764]: DEBUG nova.network.neutron [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Updated VIF entry in instance network info cache for port 8fd04950-d805-4008-8a6d-bceca7b98edc. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 551.847348] env[60764]: DEBUG nova.network.neutron [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Updating instance_info_cache with network_info: [{"id": "8fd04950-d805-4008-8a6d-bceca7b98edc", "address": "fa:16:3e:70:3d:21", "network": {"id": "7bfa728f-5596-4d9c-80cc-07e28bf59b24", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-372150778-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "596ed3da89b84f9d8fcf5d9e94002377", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fd04950-d8", "ovs_interfaceid": "8fd04950-d805-4008-8a6d-bceca7b98edc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 551.862295] env[60764]: DEBUG oslo_concurrency.lockutils [req-6d8dda87-7242-4dbf-80a0-365ab1583b95 req-a3c50f51-bb0a-4f1b-93bc-51decfd86944 service nova] Releasing lock "refresh_cache-673185d0-9e2c-49dc-8323-f8b30a65b59d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 553.266038] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "4a0b0a82-3910-4201-b1f7-34c862667e3c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.266358] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.271188] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.271402] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 
tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.283342] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 553.294092] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 553.403267] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.403513] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.408248] env[60764]: INFO nova.compute.claims [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.414444] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.725508] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fca15437-2a7e-4a0d-8653-c976c25f9684 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.737131] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fc09e8d-a704-40b0-b028-486a611db0b0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.790643] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da1cf4d9-0a94-440a-b0d6-265cf1a4ec03 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.799273] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8398b7-51eb-419e-a88e-38666cb395ec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.814984] env[60764]: DEBUG nova.compute.provider_tree [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 553.830928] env[60764]: DEBUG nova.scheduler.client.report [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 553.865107] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 553.865465] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 553.874018] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.458s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.874018] env[60764]: INFO nova.compute.claims [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.916550] env[60764]: DEBUG nova.compute.utils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 553.917922] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Allocating IP information in the background. 
{{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 553.918112] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 553.935805] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 554.051701] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 554.097650] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 554.098128] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 554.098224] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 554.098344] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 554.098480] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 554.098618] env[60764]: DEBUG 
nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 554.098957] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 554.098957] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 554.101437] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 554.101437] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 554.101437] env[60764]: DEBUG nova.virt.hardware [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 554.103035] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aca7392-8ecd-44cc-be57-666c44532a4e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.116399] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c38adcf-d30b-4297-bb9f-dabff22bce51 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.189111] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "ff2ef5e9-f543-4592-9896-e2c75369a971" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 554.189111] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "ff2ef5e9-f543-4592-9896-e2c75369a971" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 554.203327] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 554.220107] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3c5f8f2-0fad-4484-a074-334c017babe9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.231505] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a1c4414-e729-4a79-90e8-08eba2a2355a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.272950] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5579a773-f2ca-4f31-a3f3-c53d740aef01 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.281555] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fceb7394-e574-44b1-ba68-a4987418082d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.306825] env[60764]: DEBUG nova.compute.provider_tree [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 554.308943] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 554.318593] env[60764]: DEBUG nova.scheduler.client.report [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 554.341335] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.469s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 
554.341661] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 554.347795] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.039s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 554.350032] env[60764]: INFO nova.compute.claims [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 554.416378] env[60764]: DEBUG nova.compute.utils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 554.417043] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 554.417257] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 554.427163] env[60764]: DEBUG nova.policy [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '63157f2531044bd7beb38c5c36bbe8f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97a20b8b4419416081918b302c7ce395', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 554.437388] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 554.580728] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 554.604853] env[60764]: DEBUG nova.policy [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b173f063ab06496bb437e33ace66d5e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8edb262363a0404d8d18977a4e4c9765', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 554.621687] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 554.621915] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 554.622119] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 554.622539] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 554.622539] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 554.622654] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 554.623102] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 554.623323] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 554.623952] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 554.623952] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 554.623952] env[60764]: DEBUG nova.virt.hardware [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 554.624955] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6300c53a-d09d-4eac-bc90-8d1fbb9f8048 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.638361] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d52d3e7f-d8d6-4a3d-a4f6-5179d5bb20cc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.678979] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-158544ba-b7b9-410c-8127-7d23e961df86 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.689032] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5d86290-e76c-408f-97f9-78b9b2e4f185 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.739195] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38d52f08-5ceb-4a48-b1f0-dec934175c42 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.751152] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a27b62b4-1c17-4dd4-b89b-064f3f1f390b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.764648] env[60764]: DEBUG nova.compute.provider_tree [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 554.778983] env[60764]: DEBUG nova.scheduler.client.report [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 554.805084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.457s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 554.806024] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 554.865152] env[60764]: DEBUG nova.compute.utils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 554.866642] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Not allocating networking since 'none' was specified. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 554.880229] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 554.968128] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 555.002802] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 555.002802] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 555.002802] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 555.003413] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 555.003413] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 555.003413] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 555.003413] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 555.003413] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 555.003620] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 555.003620] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 555.003620] env[60764]: DEBUG nova.virt.hardware [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 555.004819] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3aae30-4607-4a1b-939b-a105b55e1457 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.014849] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-517e22ef-29ce-4765-804c-b5568d1836f3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.032384] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance VIF info [] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 555.041858] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Creating folder: Project (2ad5fac7984f4427a47b1d9551b248ac). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.042608] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-210f49b1-cd51-429d-9a18-6da14ef10786 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.055393] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Created folder: Project (2ad5fac7984f4427a47b1d9551b248ac) in parent group-v449629. [ 555.055393] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Creating folder: Instances. Parent ref: group-v449651. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.055393] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-08dc613e-1287-4648-aca2-b589967f2548 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.065325] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Created folder: Instances in parent group-v449651. [ 555.068347] env[60764]: DEBUG oslo.service.loopingcall [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 555.068347] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 555.068347] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9b01c464-8edc-4521-bd0b-bdcb278ebf5f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.084567] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 555.084567] env[60764]: value = "task-2204844" [ 555.084567] env[60764]: _type = "Task" [ 555.084567] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 555.092381] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204844, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 555.241992] env[60764]: DEBUG nova.compute.manager [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Received event network-vif-plugged-7eb10c62-134f-4f3a-97de-e994655830d6 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 555.242849] env[60764]: DEBUG oslo_concurrency.lockutils [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] Acquiring lock "bea83327-9479-46b2-bd78-c81d72359e8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 555.242849] env[60764]: DEBUG oslo_concurrency.lockutils [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] Lock "bea83327-9479-46b2-bd78-c81d72359e8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 555.242959] env[60764]: DEBUG oslo_concurrency.lockutils [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] Lock "bea83327-9479-46b2-bd78-c81d72359e8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 555.246207] env[60764]: DEBUG nova.compute.manager [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] No waiting events found dispatching network-vif-plugged-7eb10c62-134f-4f3a-97de-e994655830d6 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 555.246436] env[60764]: WARNING nova.compute.manager [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Received unexpected event network-vif-plugged-7eb10c62-134f-4f3a-97de-e994655830d6 for instance with vm_state building and task_state spawning. [ 555.246639] env[60764]: DEBUG nova.compute.manager [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Received event network-changed-7eb10c62-134f-4f3a-97de-e994655830d6 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 555.247237] env[60764]: DEBUG nova.compute.manager [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Refreshing instance network info cache due to event network-changed-7eb10c62-134f-4f3a-97de-e994655830d6.
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 555.247237] env[60764]: DEBUG oslo_concurrency.lockutils [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] Acquiring lock "refresh_cache-bea83327-9479-46b2-bd78-c81d72359e8a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 555.247237] env[60764]: DEBUG oslo_concurrency.lockutils [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] Acquired lock "refresh_cache-bea83327-9479-46b2-bd78-c81d72359e8a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 555.247353] env[60764]: DEBUG nova.network.neutron [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Refreshing network info cache for port 7eb10c62-134f-4f3a-97de-e994655830d6 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 556.160615] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Successfully created port: 6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 556.163223] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Successfully created port: 0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 556.170568] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204844, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 556.170772] env[60764]: WARNING oslo_vmware.common.loopingcall [-] task run outlasted interval by 0.084681 sec [ 556.179114] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204844, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 556.684587] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204844, 'name': CreateVM_Task, 'duration_secs': 1.330195} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 556.684807] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 556.685285] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 556.685438] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 556.685756] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 556.686017] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca749b94-8290-46f7-85b2-af0516922725 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.690808] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for the task: (returnval){ [ 556.690808] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52034e9f-4030-b300-d161-007243daa445" [ 556.690808] env[60764]: _type = "Task" [ 556.690808] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 556.700583] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52034e9f-4030-b300-d161-007243daa445, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 557.206806] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 557.206806] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 557.207199] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 557.254184] env[60764]: DEBUG nova.network.neutron [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Updated VIF entry in instance network info cache for port 7eb10c62-134f-4f3a-97de-e994655830d6. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 557.254583] env[60764]: DEBUG nova.network.neutron [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Updating instance_info_cache with network_info: [{"id": "7eb10c62-134f-4f3a-97de-e994655830d6", "address": "fa:16:3e:51:0b:18", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.131", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7eb10c62-13", "ovs_interfaceid": "7eb10c62-134f-4f3a-97de-e994655830d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 557.272243] env[60764]: DEBUG oslo_concurrency.lockutils [req-fe8f8abd-fc99-4e32-b589-bd959d2a6ef8 req-a55d4267-54c6-4bb6-af7c-204056f62b30 service nova] Releasing lock "refresh_cache-bea83327-9479-46b2-bd78-c81d72359e8a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
557.745520] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Successfully updated port: 6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 557.761214] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "refresh_cache-4a0b0a82-3910-4201-b1f7-34c862667e3c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 557.761214] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquired lock "refresh_cache-4a0b0a82-3910-4201-b1f7-34c862667e3c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 557.761214] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 557.803441] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "b846d9ae-759a-4898-9ede-091819325701" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 557.803656] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "b846d9ae-759a-4898-9ede-091819325701" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 557.868614] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance cache missing network info.
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 558.445473] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Updating instance_info_cache with network_info: [{"id": "6afec9c7-c317-41ff-8876-bb9c8ab9e6b3", "address": "fa:16:3e:16:a8:33", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6afec9c7-c3", "ovs_interfaceid": "6afec9c7-c317-41ff-8876-bb9c8ab9e6b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 558.460784] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Releasing lock "refresh_cache-4a0b0a82-3910-4201-b1f7-34c862667e3c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 558.461197] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance network_info: |[{"id": "6afec9c7-c317-41ff-8876-bb9c8ab9e6b3", "address": "fa:16:3e:16:a8:33", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6afec9c7-c3", "ovs_interfaceid": "6afec9c7-c317-41ff-8876-bb9c8ab9e6b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 558.461547] env[60764]: 
DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:16:a8:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6afec9c7-c317-41ff-8876-bb9c8ab9e6b3', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 558.470676] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Creating folder: Project (8edb262363a0404d8d18977a4e4c9765). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.471639] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ba71434b-4a6f-45f5-aab4-a8d619374c6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.482332] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Created folder: Project (8edb262363a0404d8d18977a4e4c9765) in parent group-v449629. [ 558.482332] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Creating folder: Instances. Parent ref: group-v449654. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.482425] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f86b2d32-b824-4220-851e-d6d946b47932 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.491704] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Created folder: Instances in parent group-v449654. [ 558.491960] env[60764]: DEBUG oslo.service.loopingcall [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 558.492909] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 558.492909] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-918f71cf-3fa0-4d16-be3c-40c7b187355c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.514872] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 558.514872] env[60764]: value = "task-2204847" [ 558.514872] env[60764]: _type = "Task" [ 558.514872] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 558.528419] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204847, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 558.693418] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Successfully updated port: 0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.713332] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "refresh_cache-7957eb49-d540-4c4e-a86a-1ea3631fb5ef" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 558.713332] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquired lock "refresh_cache-7957eb49-d540-4c4e-a86a-1ea3631fb5ef" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 558.713332] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 558.808998] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 559.027721] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204847, 'name': CreateVM_Task, 'duration_secs': 0.420481} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 559.027721] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 559.027721] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.027721] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 559.027721] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 559.028076] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fc5a2f9-515a-43d0-ad9f-cfe0a637e74f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.034368] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Waiting for the task: (returnval){ [ 559.034368] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52190348-fec9-ec8b-12b0-464332240627" [ 559.034368] env[60764]: _type = "Task" [ 559.034368] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.048934] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52190348-fec9-ec8b-12b0-464332240627, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.077142] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Updating instance_info_cache with network_info: [{"id": "0cdf5f77-1ee8-4b0c-9740-f1d59327ca89", "address": "fa:16:3e:38:22:b5", "network": {"id": "ed092c70-eb40-4eca-8238-03bab9f7778b", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1372403250-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97a20b8b4419416081918b302c7ce395", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cdf5f77-1e", "ovs_interfaceid": "0cdf5f77-1ee8-4b0c-9740-f1d59327ca89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 559.095502] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Releasing lock "refresh_cache-7957eb49-d540-4c4e-a86a-1ea3631fb5ef" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 559.095828] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance network_info: |[{"id": "0cdf5f77-1ee8-4b0c-9740-f1d59327ca89", "address": "fa:16:3e:38:22:b5", "network": {"id": "ed092c70-eb40-4eca-8238-03bab9f7778b", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1372403250-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97a20b8b4419416081918b302c7ce395", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cdf5f77-1e", "ovs_interfaceid": "0cdf5f77-1ee8-4b0c-9740-f1d59327ca89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 559.096252] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:22:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0cdf5f77-1ee8-4b0c-9740-f1d59327ca89', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 559.104082] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Creating folder: Project (97a20b8b4419416081918b302c7ce395). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.104706] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d8508f70-239d-461f-9d25-e5f7e4a18b76 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.116605] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Created folder: Project (97a20b8b4419416081918b302c7ce395) in parent group-v449629. [ 559.116802] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Creating folder: Instances. Parent ref: group-v449657. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.117043] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-efda6b68-97ab-4c89-a0c0-013a90b87bfb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.127730] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Created folder: Instances in parent group-v449657. [ 559.127965] env[60764]: DEBUG oslo.service.loopingcall [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 559.128165] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 559.128364] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b83fb2c2-9d36-4c7c-952f-9e3aab065702 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.150487] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 559.150487] env[60764]: value = "task-2204850" [ 559.150487] env[60764]: _type = "Task" [ 559.150487] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.159067] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204850, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.551534] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 559.551899] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 559.552150] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.609510] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 559.609823] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 559.661476] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204850, 'name': CreateVM_Task, 'duration_secs': 0.381372} completed successfully.
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 559.661592] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 559.662376] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.663207] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 559.663207] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 559.663207] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f234809-325c-45f1-8bf7-ba69d6f60136 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.668292] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Waiting for the task: (returnval){ [ 559.668292] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ed7a91-a0ec-e615-a67c-12adc4ca4bdc" [ 559.668292] env[60764]: _type = "Task" [ 559.668292] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.677612] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ed7a91-a0ec-e615-a67c-12adc4ca4bdc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 560.180451] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 560.180706] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 560.180917] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 560.187789] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "437e0c0d-6d0e-4465-9651-14e420b646ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.188052] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 560.944643] env[60764]: DEBUG nova.compute.manager [req-575dcb75-8585-4b04-b541-fe2b283387c3 req-5174f60b-0588-4450-bcfd-89a3418c435e service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Received event network-vif-plugged-6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 560.944899] env[60764]: DEBUG oslo_concurrency.lockutils [req-575dcb75-8585-4b04-b541-fe2b283387c3 req-5174f60b-0588-4450-bcfd-89a3418c435e service nova] Acquiring lock "4a0b0a82-3910-4201-b1f7-34c862667e3c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.945078] env[60764]: DEBUG oslo_concurrency.lockutils [req-575dcb75-8585-4b04-b541-fe2b283387c3 req-5174f60b-0588-4450-bcfd-89a3418c435e service nova] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 560.945256] env[60764]: 
DEBUG oslo_concurrency.lockutils [req-575dcb75-8585-4b04-b541-fe2b283387c3 req-5174f60b-0588-4450-bcfd-89a3418c435e service nova] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 560.945420] env[60764]: DEBUG nova.compute.manager [req-575dcb75-8585-4b04-b541-fe2b283387c3 req-5174f60b-0588-4450-bcfd-89a3418c435e service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] No waiting events found dispatching network-vif-plugged-6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 560.947247] env[60764]: WARNING nova.compute.manager [req-575dcb75-8585-4b04-b541-fe2b283387c3 req-5174f60b-0588-4450-bcfd-89a3418c435e service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Received unexpected event network-vif-plugged-6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 for instance with vm_state building and task_state spawning. [ 561.125695] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "1f11c625-166f-4609-badf-da4dd9475c37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.126138] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "1f11c625-166f-4609-badf-da4dd9475c37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 561.812589] env[60764]: DEBUG nova.compute.manager [req-afefa5be-617a-47f1-aada-ec0be9c19ce6 req-a57702b1-9f0a-4494-95c2-78f0f5b4a546 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Received event network-vif-plugged-0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 561.812589] env[60764]: DEBUG oslo_concurrency.lockutils [req-afefa5be-617a-47f1-aada-ec0be9c19ce6 req-a57702b1-9f0a-4494-95c2-78f0f5b4a546 service nova] Acquiring lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.812589] env[60764]: DEBUG oslo_concurrency.lockutils [req-afefa5be-617a-47f1-aada-ec0be9c19ce6 req-a57702b1-9f0a-4494-95c2-78f0f5b4a546 service nova] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 561.812589] env[60764]: DEBUG oslo_concurrency.lockutils [req-afefa5be-617a-47f1-aada-ec0be9c19ce6 req-a57702b1-9f0a-4494-95c2-78f0f5b4a546 service nova] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 
0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 561.812774] env[60764]: DEBUG nova.compute.manager [req-afefa5be-617a-47f1-aada-ec0be9c19ce6 req-a57702b1-9f0a-4494-95c2-78f0f5b4a546 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] No waiting events found dispatching network-vif-plugged-0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 561.812774] env[60764]: WARNING nova.compute.manager [req-afefa5be-617a-47f1-aada-ec0be9c19ce6 req-a57702b1-9f0a-4494-95c2-78f0f5b4a546 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Received unexpected event network-vif-plugged-0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 for instance with vm_state building and task_state spawning. [ 564.165496] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40a3af0c-0c78-47a5-b816-503530f06d87 tempest-VolumesAssistedSnapshotsTest-1009492998 tempest-VolumesAssistedSnapshotsTest-1009492998-project-member] Acquiring lock "7a378ea2-b981-443a-a925-9819ac5b979f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 564.165821] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40a3af0c-0c78-47a5-b816-503530f06d87 tempest-VolumesAssistedSnapshotsTest-1009492998 tempest-VolumesAssistedSnapshotsTest-1009492998-project-member] Lock "7a378ea2-b981-443a-a925-9819ac5b979f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 564.649301] env[60764]: DEBUG nova.compute.manager [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Received event network-changed-6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 564.649578] env[60764]: DEBUG nova.compute.manager [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Refreshing instance network info cache due to event network-changed-6afec9c7-c317-41ff-8876-bb9c8ab9e6b3.
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 564.649690] env[60764]: DEBUG oslo_concurrency.lockutils [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] Acquiring lock "refresh_cache-4a0b0a82-3910-4201-b1f7-34c862667e3c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 564.650237] env[60764]: DEBUG oslo_concurrency.lockutils [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] Acquired lock "refresh_cache-4a0b0a82-3910-4201-b1f7-34c862667e3c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 564.650237] env[60764]: DEBUG nova.network.neutron [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Refreshing network info cache for port 6afec9c7-c317-41ff-8876-bb9c8ab9e6b3 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 565.140305] env[60764]: DEBUG nova.network.neutron [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Updated VIF entry in instance network info cache for port 6afec9c7-c317-41ff-8876-bb9c8ab9e6b3. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 565.140650] env[60764]: DEBUG nova.network.neutron [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Updating instance_info_cache with network_info: [{"id": "6afec9c7-c317-41ff-8876-bb9c8ab9e6b3", "address": "fa:16:3e:16:a8:33", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.122", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6afec9c7-c3", "ovs_interfaceid": "6afec9c7-c317-41ff-8876-bb9c8ab9e6b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 565.151022] env[60764]: DEBUG oslo_concurrency.lockutils [req-8edd2055-034e-4459-9d58-7c2d24bf4617 req-cf0f600b-c7d9-42d1-88ed-edfb053b707b service nova] Releasing lock "refresh_cache-4a0b0a82-3910-4201-b1f7-34c862667e3c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 565.478746] env[60764]: DEBUG nova.compute.manager [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Received event 
network-changed-0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 565.479012] env[60764]: DEBUG nova.compute.manager [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Refreshing instance network info cache due to event network-changed-0cdf5f77-1ee8-4b0c-9740-f1d59327ca89. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 565.483314] env[60764]: DEBUG oslo_concurrency.lockutils [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] Acquiring lock "refresh_cache-7957eb49-d540-4c4e-a86a-1ea3631fb5ef" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 565.483497] env[60764]: DEBUG oslo_concurrency.lockutils [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] Acquired lock "refresh_cache-7957eb49-d540-4c4e-a86a-1ea3631fb5ef" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 565.483670] env[60764]: DEBUG nova.network.neutron [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Refreshing network info cache for port 0cdf5f77-1ee8-4b0c-9740-f1d59327ca89 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 565.877773] env[60764]: DEBUG nova.network.neutron [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Updated VIF entry in instance network info cache for port 0cdf5f77-1ee8-4b0c-9740-f1d59327ca89. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 565.878137] env[60764]: DEBUG nova.network.neutron [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Updating instance_info_cache with network_info: [{"id": "0cdf5f77-1ee8-4b0c-9740-f1d59327ca89", "address": "fa:16:3e:38:22:b5", "network": {"id": "ed092c70-eb40-4eca-8238-03bab9f7778b", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1372403250-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97a20b8b4419416081918b302c7ce395", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d689fd7-f53e-4fd3-80d9-8d6b8fb7a164", "external-id": "nsx-vlan-transportzone-972", "segmentation_id": 972, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cdf5f77-1e", "ovs_interfaceid": "0cdf5f77-1ee8-4b0c-9740-f1d59327ca89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 565.890100] env[60764]: DEBUG oslo_concurrency.lockutils [req-44638382-b0d8-4820-ae95-54e5c7f53c29 req-5ccc9e22-7898-4a79-a949-d2d0fde75eb8 service nova] Releasing lock "refresh_cache-7957eb49-d540-4c4e-a86a-1ea3631fb5ef" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 566.851305] env[60764]: DEBUG oslo_concurrency.lockutils [None req-027ed7af-4498-4459-884b-21717a6ec8ef tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "72e8a8e0-6229-4557-889b-73851a13dbc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.851596] env[60764]: DEBUG oslo_concurrency.lockutils [None req-027ed7af-4498-4459-884b-21717a6ec8ef tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "72e8a8e0-6229-4557-889b-73851a13dbc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 567.534417] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0a95c3d4-f4c1-4317-8f29-d690850754f8 tempest-ImagesNegativeTestJSON-1879878930 tempest-ImagesNegativeTestJSON-1879878930-project-member] Acquiring lock "586d9ca2-c287-49bc-bf61-a5140ceaddea" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 567.534739] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0a95c3d4-f4c1-4317-8f29-d690850754f8 tempest-ImagesNegativeTestJSON-1879878930 
tempest-ImagesNegativeTestJSON-1879878930-project-member] Lock "586d9ca2-c287-49bc-bf61-a5140ceaddea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 568.774774] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b9285dae-455e-4eb2-ae5b-72bd4e60b168 tempest-AttachInterfacesUnderV243Test-1999804320 tempest-AttachInterfacesUnderV243Test-1999804320-project-member] Acquiring lock "72225dcc-d210-49cf-8395-056bf4f7f652" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 568.775511] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b9285dae-455e-4eb2-ae5b-72bd4e60b168 tempest-AttachInterfacesUnderV243Test-1999804320 tempest-AttachInterfacesUnderV243Test-1999804320-project-member] Lock "72225dcc-d210-49cf-8395-056bf4f7f652" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 578.070414] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25dbeba2-7296-40d9-acb6-05202f9bb1da tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "7d8b4524-f867-4605-a468-a1c39e77dabd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 578.070907] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25dbeba2-7296-40d9-acb6-05202f9bb1da tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "7d8b4524-f867-4605-a468-a1c39e77dabd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 579.509621] env[60764]: DEBUG oslo_concurrency.lockutils [None req-48384ae7-d525-4af3-8a30-c942e3265f51 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "59c6c887-b973-4a03-ba64-4ad12be45f64" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 579.509910] env[60764]: DEBUG oslo_concurrency.lockutils [None req-48384ae7-d525-4af3-8a30-c942e3265f51 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "59c6c887-b973-4a03-ba64-4ad12be45f64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 583.708084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fdedc5e8-8f91-4526-a378-611a505be85d tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "b154a29f-564d-449d-8321-8ded1d4ec29b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 583.708414] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fdedc5e8-8f91-4526-a378-611a505be85d tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b154a29f-564d-449d-8321-8ded1d4ec29b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.360659] env[60764]: DEBUG oslo_concurrency.lockutils [None req-79ee51a8-a915-41e4-83df-367e0bef2851 tempest-InstanceActionsV221TestJSON-2132129667 tempest-InstanceActionsV221TestJSON-2132129667-project-member] Acquiring lock "330f33a1-cc70-4346-b6c5-e26720ed72f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.360977] env[60764]: DEBUG oslo_concurrency.lockutils [None req-79ee51a8-a915-41e4-83df-367e0bef2851 tempest-InstanceActionsV221TestJSON-2132129667 tempest-InstanceActionsV221TestJSON-2132129667-project-member] Lock "330f33a1-cc70-4346-b6c5-e26720ed72f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 585.602348] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bc55a72e-1728-4f37-8111-89c2f67847de tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "b24a1521-7fcb-4369-ab8e-211690508a67" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 585.602578] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bc55a72e-1728-4f37-8111-89c2f67847de tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "b24a1521-7fcb-4369-ab8e-211690508a67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 588.859033] env[60764]: WARNING oslo_vmware.rw_handles [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 588.859033] 
env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 588.859033] env[60764]: ERROR oslo_vmware.rw_handles [ 588.859033] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 588.859810] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 588.859810] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Copying Virtual Disk [datastore2] vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/90579739-38cc-46f1-8ae9-1af966a8f3d3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 588.859810] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-940251d4-b1f3-472a-90ab-90bcdcc4bd65 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 588.872671] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Waiting for the task: (returnval){ [ 588.872671] env[60764]: value = "task-2204855" [ 588.872671] env[60764]: _type = "Task" [ 588.872671] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 588.884211] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Task: {'id': task-2204855, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 589.317859] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e4a1568-531d-44a0-9c82-d6ee4b410c79 tempest-ServersTestManualDisk-764844659 tempest-ServersTestManualDisk-764844659-project-member] Acquiring lock "130cd4e5-9df4-4428-abaa-d937d73d9950" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.318606] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e4a1568-531d-44a0-9c82-d6ee4b410c79 tempest-ServersTestManualDisk-764844659 tempest-ServersTestManualDisk-764844659-project-member] Lock "130cd4e5-9df4-4428-abaa-d937d73d9950" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.389209] env[60764]: DEBUG oslo_vmware.exceptions [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 589.389209] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 589.393550] env[60764]: ERROR nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 589.393550] env[60764]: Faults: ['InvalidArgument'] [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Traceback (most recent call last): [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] yield resources [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self.driver.spawn(context, instance, image_meta, [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self._vmops.spawn(context, instance, image_meta, injected_files, [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 589.393550] env[60764]: 
ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self._fetch_image_if_missing(context, vi) [ 589.393550] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] image_cache(vi, tmp_image_ds_loc) [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] vm_util.copy_virtual_disk( [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] session._wait_for_task(vmdk_copy_task) [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] return self.wait_for_task(task_ref) [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] return evt.wait() [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] result = hub.switch() [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 589.393976] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] return self.greenlet.switch() [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self.f(*self.args, **self.kw) [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] raise exceptions.translate_fault(task_info.error) [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Faults: ['InvalidArgument'] [ 589.394358] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] [ 589.394358] env[60764]: INFO 
nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Terminating instance [ 589.396304] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 589.396502] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 589.396926] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 589.397155] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 589.398860] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5815115-3db2-4514-87f0-25257fdfc4ed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.403528] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2413c38e-f102-4710-87bb-3f50da82987b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.413780] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 589.413875] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f466c2c-2be1-4721-9823-dec38654ffca {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.415926] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 589.416055] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 589.416980] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fc6007a1-901b-4b35-9a42-d1a67d14c28e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.422456] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Waiting for the task: (returnval){ [ 589.422456] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]522de732-cb5c-c909-3eec-bce672d79662" [ 589.422456] env[60764]: _type = "Task" [ 589.422456] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 589.433717] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]522de732-cb5c-c909-3eec-bce672d79662, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 589.492436] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 589.494058] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 589.494058] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Deleting the datastore file [datastore2] 116f360a-6080-46c7-8234-69fe54b9a147 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 589.494058] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3079fb7f-2aaf-40ca-bd6d-3b0ca5a42706 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.503755] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Waiting for the task: (returnval){ [ 589.503755] env[60764]: value = "task-2204858" [ 589.503755] env[60764]: _type = "Task" [ 589.503755] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 589.516441] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Task: {'id': task-2204858, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 589.933677] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 589.933931] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Creating directory with path [datastore2] vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 589.933974] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b78d3563-dbc9-4877-8762-9f639d9e2a68 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.946497] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Created directory with path [datastore2] vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 589.946708] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Fetch image to [datastore2] vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 589.946873] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 589.948011] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18d3e061-9ab7-48af-af78-26b6b0c4235e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.956785] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8191d9ad-449d-469a-b9de-598a6355d943 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 589.970848] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-652c5ab0-4b66-4c29-bcf4-87dd7ee5cc30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.017232] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84b803d5-e729-412b-a09e-5c7825c1fa35 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.028191] env[60764]: DEBUG oslo_vmware.api [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Task: {'id': task-2204858, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.198767} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 590.030364] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 590.032301] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 590.032301] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 590.032301] env[60764]: INFO nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Took 0.63 seconds to destroy the instance on the hypervisor. 
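The two datastore tasks traced just above follow the same polling pattern: CopyVirtualDisk_Task (task-2204855) is polled at 0% and later surfaces as a VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']) raised out of _poll_task, while DeleteDatastoreFile_Task (task-2204858) is polled once and completes successfully after roughly 0.2s. The Python sketch below is a minimal, library-free illustration of that wait-for-task loop as suggested by the wait_for_task/_poll_task frames in the tracebacks; TaskInfo, TaskFault, wait_for_task and poll are hypothetical stand-ins for illustration only, not the actual oslo.vmware API.

    # Minimal sketch of a vCenter-style task polling loop, modeled on the
    # wait_for_task/_poll_task call chain visible in the tracebacks above.
    # All names here (TaskInfo, TaskFault, wait_for_task, poll) are
    # hypothetical stand-ins, not the real oslo.vmware interfaces.
    import time
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class TaskInfo:
        state: str                   # "running", "success" or "error"
        progress: int = 0            # logged as "progress is N%"
        error: Optional[str] = None  # fault name, e.g. "InvalidArgument"

    class TaskFault(Exception):
        """Raised when the polled task ends in the 'error' state."""

    def wait_for_task(poll: Callable[[], TaskInfo], interval: float = 0.5) -> TaskInfo:
        """Poll a task until it succeeds or fails.

        Mirrors the pattern in the log: repeated 'progress is N%' DEBUG
        lines, then either 'completed successfully' or a fault that is
        translated into an exception.
        """
        while True:
            info = poll()
            if info.state == "success":
                return info
            if info.state == "error":
                # analogous to: raise exceptions.translate_fault(task_info.error)
                raise TaskFault(info.error or "unknown fault")
            print(f"progress is {info.progress}%")  # stand-in for the DEBUG poll line
            time.sleep(interval)

    if __name__ == "__main__":
        # Simulate a task that reports 0% once and then fails the way
        # CopyVirtualDisk_Task does above.
        states = iter([
            TaskInfo(state="running", progress=0),
            TaskInfo(state="error", error="InvalidArgument: fileType"),
        ])
        try:
            wait_for_task(lambda: next(states), interval=0.01)
        except TaskFault as fault:
            print(f"task failed: {fault}")

When the poll ends in error, the translated fault propagates back through copy_virtual_disk and _fetch_image_if_missing, which is why the same InvalidArgument traceback reappears further down as "Failed to build and run instance" before the compute_resources claim is aborted and the build of 116f360a-6080-46c7-8234-69fe54b9a147 is rescheduled.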
[ 590.032780] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4d377634-bac3-4643-9119-e54c5590b4dd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.035276] env[60764]: DEBUG nova.compute.claims [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 590.035583] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.035649] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.057689] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 590.137113] env[60764]: DEBUG oslo_vmware.rw_handles [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 590.204652] env[60764]: DEBUG oslo_vmware.rw_handles [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 590.204861] env[60764]: DEBUG oslo_vmware.rw_handles [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 590.560312] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e72e8d39-cb00-42e9-acd7-2b7d7c3f9f99 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.567388] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-582c3a9c-09cd-4d11-8b8c-169e60bf134e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.597439] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a257c33-a416-4863-8783-0ff5d92e55cd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.605060] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e34a41fd-1c71-41ae-b0e3-553aceca1c4b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 590.621277] env[60764]: DEBUG nova.compute.provider_tree [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 590.632984] env[60764]: DEBUG nova.scheduler.client.report [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 590.652853] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.617s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 590.653404] env[60764]: ERROR nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 590.653404] env[60764]: Faults: ['InvalidArgument'] [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Traceback (most recent call last): [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 
116f360a-6080-46c7-8234-69fe54b9a147] self.driver.spawn(context, instance, image_meta, [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self._vmops.spawn(context, instance, image_meta, injected_files, [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self._fetch_image_if_missing(context, vi) [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] image_cache(vi, tmp_image_ds_loc) [ 590.653404] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] vm_util.copy_virtual_disk( [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] session._wait_for_task(vmdk_copy_task) [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] return self.wait_for_task(task_ref) [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] return evt.wait() [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] result = hub.switch() [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] return self.greenlet.switch() [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 590.653799] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] self.f(*self.args, **self.kw) [ 590.654182] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 590.654182] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] raise exceptions.translate_fault(task_info.error) [ 590.654182] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 590.654182] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Faults: ['InvalidArgument'] [ 590.654182] env[60764]: ERROR nova.compute.manager [instance: 116f360a-6080-46c7-8234-69fe54b9a147] [ 590.654614] env[60764]: DEBUG nova.compute.utils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 590.658603] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Build of instance 116f360a-6080-46c7-8234-69fe54b9a147 was re-scheduled: A specified parameter was not correct: fileType [ 590.658603] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 590.658780] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 590.658949] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 590.659113] env[60764]: DEBUG nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 590.659276] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 591.362917] env[60764]: DEBUG nova.network.neutron [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.378564] env[60764]: INFO nova.compute.manager [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] [instance: 116f360a-6080-46c7-8234-69fe54b9a147] Took 0.72 seconds to deallocate network for instance. [ 591.468519] env[60764]: DEBUG oslo_concurrency.lockutils [None req-86d04512-f5cc-4259-83af-39ddeffd2e29 tempest-ServersAaction247Test-1656782549 tempest-ServersAaction247Test-1656782549-project-member] Acquiring lock "c80cde37-2b69-46f7-9ca6-bfa618a18b1e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 591.468763] env[60764]: DEBUG oslo_concurrency.lockutils [None req-86d04512-f5cc-4259-83af-39ddeffd2e29 tempest-ServersAaction247Test-1656782549 tempest-ServersAaction247Test-1656782549-project-member] Lock "c80cde37-2b69-46f7-9ca6-bfa618a18b1e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.517282] env[60764]: INFO nova.scheduler.client.report [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Deleted allocations for instance 116f360a-6080-46c7-8234-69fe54b9a147 [ 591.555402] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebf37e99-2707-4dc3-ac1f-67593fd003e9 tempest-TenantUsagesTestJSON-721646480 tempest-TenantUsagesTestJSON-721646480-project-member] Lock "116f360a-6080-46c7-8234-69fe54b9a147" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 57.048s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 591.594398] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 591.675866] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 591.676158] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 591.677896] env[60764]: INFO nova.compute.claims [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 592.129145] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-678a4cc7-ccf9-4dd6-8c5d-928841441175 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.138631] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bb9f391-bcf8-4a5d-95cf-250f6c82bcee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.174428] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2bfdb39-062f-4bd8-917b-9519dd1f5b7b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.183959] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-580cb236-ad3e-4249-8b7b-da5120319792 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.201179] env[60764]: DEBUG nova.compute.provider_tree [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 592.210451] env[60764]: DEBUG nova.scheduler.client.report [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 592.229630] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.553s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 592.230197] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 592.271850] env[60764]: DEBUG nova.compute.utils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 592.273213] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 592.275898] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 592.290388] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 592.354131] env[60764]: DEBUG nova.policy [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'adbb7822347f4b9b8fa2fc63c9b72dd7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1287a3b6f504021baeedfadf0887bb8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 592.389196] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 592.418345] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 592.418576] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 592.418729] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 592.418897] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 592.419081] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 592.419233] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 592.419436] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 592.419606] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 592.419870] env[60764]: DEBUG nova.virt.hardware [None 
req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 592.420225] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 592.420470] env[60764]: DEBUG nova.virt.hardware [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 592.421626] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c447eb6f-c784-4e89-b061-29fa5608a6a8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 592.432332] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acba93dd-51f5-45e2-86e2-5f7d0ad47d86 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 593.059093] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Successfully created port: 5c9195aa-a9fa-47b3-824d-d90372927066 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 593.463387] env[60764]: DEBUG oslo_concurrency.lockutils [None req-32d25acc-108a-4cee-987b-6f738651cc19 tempest-ServerAddressesTestJSON-1852641836 tempest-ServerAddressesTestJSON-1852641836-project-member] Acquiring lock "7e3c6624-3c29-4df0-b417-575976b2f0f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 593.463387] env[60764]: DEBUG oslo_concurrency.lockutils [None req-32d25acc-108a-4cee-987b-6f738651cc19 tempest-ServerAddressesTestJSON-1852641836 tempest-ServerAddressesTestJSON-1852641836-project-member] Lock "7e3c6624-3c29-4df0-b417-575976b2f0f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 593.984115] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Successfully updated port: 5c9195aa-a9fa-47b3-824d-d90372927066 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 593.997947] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "refresh_cache-b846d9ae-759a-4898-9ede-091819325701" {{(pid=60764) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 593.998102] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquired lock "refresh_cache-b846d9ae-759a-4898-9ede-091819325701" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 593.998248] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 594.054161] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 594.337511] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Updating instance_info_cache with network_info: [{"id": "5c9195aa-a9fa-47b3-824d-d90372927066", "address": "fa:16:3e:4f:c7:27", "network": {"id": "b3df541b-a60e-48f6-8a76-9aa06ec1904c", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-57555243-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1287a3b6f504021baeedfadf0887bb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c9195aa-a9", "ovs_interfaceid": "5c9195aa-a9fa-47b3-824d-d90372927066", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 594.355116] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Releasing lock "refresh_cache-b846d9ae-759a-4898-9ede-091819325701" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 594.355363] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance network_info: |[{"id": "5c9195aa-a9fa-47b3-824d-d90372927066", "address": "fa:16:3e:4f:c7:27", "network": {"id": 
"b3df541b-a60e-48f6-8a76-9aa06ec1904c", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-57555243-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1287a3b6f504021baeedfadf0887bb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c9195aa-a9", "ovs_interfaceid": "5c9195aa-a9fa-47b3-824d-d90372927066", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 594.356213] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4f:c7:27', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6046aec4-feda-4ef9-bf4a-800de1e0cd3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5c9195aa-a9fa-47b3-824d-d90372927066', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 594.366027] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Creating folder: Project (b1287a3b6f504021baeedfadf0887bb8). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 594.366027] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e7786f7-4571-4fac-8892-413258af17ed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.377525] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Created folder: Project (b1287a3b6f504021baeedfadf0887bb8) in parent group-v449629. [ 594.377735] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Creating folder: Instances. Parent ref: group-v449663. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 594.378023] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8e060b00-3c3a-44ea-ae9f-bc6574b5ac9e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.392624] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Created folder: Instances in parent group-v449663. [ 594.392876] env[60764]: DEBUG oslo.service.loopingcall [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 594.393082] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b846d9ae-759a-4898-9ede-091819325701] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 594.393297] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-51c8a159-45c8-463c-a0a6-8fca79e6fc50 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 594.415117] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 594.415117] env[60764]: value = "task-2204863" [ 594.415117] env[60764]: _type = "Task" [ 594.415117] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 594.425374] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204863, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 594.823334] env[60764]: DEBUG nova.compute.manager [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Received event network-vif-plugged-5c9195aa-a9fa-47b3-824d-d90372927066 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 594.823334] env[60764]: DEBUG oslo_concurrency.lockutils [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] Acquiring lock "b846d9ae-759a-4898-9ede-091819325701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 594.823601] env[60764]: DEBUG oslo_concurrency.lockutils [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] Lock "b846d9ae-759a-4898-9ede-091819325701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 594.823601] env[60764]: DEBUG oslo_concurrency.lockutils [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] Lock "b846d9ae-759a-4898-9ede-091819325701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 594.823717] env[60764]: DEBUG nova.compute.manager [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] No waiting events found dispatching network-vif-plugged-5c9195aa-a9fa-47b3-824d-d90372927066 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 594.823855] env[60764]: WARNING nova.compute.manager [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Received unexpected event network-vif-plugged-5c9195aa-a9fa-47b3-824d-d90372927066 for instance with vm_state building and task_state spawning. [ 594.824018] env[60764]: DEBUG nova.compute.manager [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Received event network-changed-5c9195aa-a9fa-47b3-824d-d90372927066 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 594.824173] env[60764]: DEBUG nova.compute.manager [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Refreshing instance network info cache due to event network-changed-5c9195aa-a9fa-47b3-824d-d90372927066. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 594.824510] env[60764]: DEBUG oslo_concurrency.lockutils [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] Acquiring lock "refresh_cache-b846d9ae-759a-4898-9ede-091819325701" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 594.824656] env[60764]: DEBUG oslo_concurrency.lockutils [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] Acquired lock "refresh_cache-b846d9ae-759a-4898-9ede-091819325701" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 594.825896] env[60764]: DEBUG nova.network.neutron [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Refreshing network info cache for port 5c9195aa-a9fa-47b3-824d-d90372927066 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 594.934046] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204863, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.219340] env[60764]: DEBUG nova.network.neutron [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Updated VIF entry in instance network info cache for port 5c9195aa-a9fa-47b3-824d-d90372927066. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 595.219340] env[60764]: DEBUG nova.network.neutron [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] [instance: b846d9ae-759a-4898-9ede-091819325701] Updating instance_info_cache with network_info: [{"id": "5c9195aa-a9fa-47b3-824d-d90372927066", "address": "fa:16:3e:4f:c7:27", "network": {"id": "b3df541b-a60e-48f6-8a76-9aa06ec1904c", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-57555243-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b1287a3b6f504021baeedfadf0887bb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c9195aa-a9", "ovs_interfaceid": "5c9195aa-a9fa-47b3-824d-d90372927066", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.232818] env[60764]: DEBUG oslo_concurrency.lockutils [req-222e1fd5-0a33-42f1-945b-2c229b17e6ba req-2be245f0-dc5f-4095-93e7-403584782f83 service nova] Releasing lock "refresh_cache-b846d9ae-759a-4898-9ede-091819325701" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
595.429922] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204863, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 595.931812] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204863, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 596.432055] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204863, 'name': CreateVM_Task, 'duration_secs': 1.580952} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 596.432055] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b846d9ae-759a-4898-9ede-091819325701] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 596.432608] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 596.432914] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 596.433396] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 596.433759] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54707af2-4eba-4058-88e6-fb194b6f6f2e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 596.439421] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Waiting for the task: (returnval){ [ 596.439421] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52114402-6696-9f86-7917-38e4fc315863" [ 596.439421] env[60764]: _type = "Task" [ 596.439421] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 596.451962] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52114402-6696-9f86-7917-38e4fc315863, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 596.950648] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 596.951081] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 596.951414] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 598.466826] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "6af62a04-4a13-46c7-a0b2-28768c789f23" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 598.467229] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.859680] env[60764]: DEBUG oslo_concurrency.lockutils [None req-606aca73-fcb7-4124-b419-7f26859b5b5d tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "a48f97b9-8720-4ad2-82a3-bae679b6b2ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 601.860084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-606aca73-fcb7-4124-b419-7f26859b5b5d tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "a48f97b9-8720-4ad2-82a3-bae679b6b2ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 603.904511] env[60764]: DEBUG oslo_concurrency.lockutils [None req-90b2c84b-051b-47c5-b3b5-b2639dfcb03a tempest-ServerShowV254Test-1358377146 tempest-ServerShowV254Test-1358377146-project-member] Acquiring lock "8eb675b6-f73d-47a0-af75-63b5a8800e69" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 603.904811] env[60764]: DEBUG oslo_concurrency.lockutils [None req-90b2c84b-051b-47c5-b3b5-b2639dfcb03a tempest-ServerShowV254Test-1358377146 tempest-ServerShowV254Test-1358377146-project-member] Lock "8eb675b6-f73d-47a0-af75-63b5a8800e69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 605.693616] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b336dd07-f201-4870-bd01-49938624ba7b tempest-ServerActionsV293TestJSON-292591111 tempest-ServerActionsV293TestJSON-292591111-project-member] Acquiring lock "fdd51c42-e0c1-4ad1-aee9-f091cfec0030" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 605.693991] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b336dd07-f201-4870-bd01-49938624ba7b tempest-ServerActionsV293TestJSON-292591111 tempest-ServerActionsV293TestJSON-292591111-project-member] Lock "fdd51c42-e0c1-4ad1-aee9-f091cfec0030" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 608.105681] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Acquiring lock "207825f2-e4aa-4747-bd79-384773b3d516" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.106100] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "207825f2-e4aa-4747-bd79-384773b3d516" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 608.140779] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Acquiring lock "57919d5b-769c-4752-873c-d78bb61f5800" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.141012] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "57919d5b-769c-4752-873c-d78bb61f5800" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.852873] env[60764]: DEBUG 
oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 609.881126] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 609.881382] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 609.881585] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 609.881848] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 609.882232] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 609.882232] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 609.894220] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.894510] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 609.894603] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 609.894763] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 609.895852] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54800581-3247-4406-ae97-62675d480c28 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.905396] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be32d123-3b44-43b9-832d-7e40337ba93b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.920245] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80c67307-2c8a-4308-8caf-1d83630073b3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.927138] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4ad0859-7211-455f-9440-0c0ac4e4d16e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 609.960571] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181243MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 609.960742] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.960949] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.041624] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e60a6397-30ad-48cb-ab52-7ae977615dc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.041796] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2f530484-d828-4b65-a81e-c1c1a84ec903 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.041926] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 673185d0-9e2c-49dc-8323-f8b30a65b59d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042067] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042190] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bea83327-9479-46b2-bd78-c81d72359e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042310] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042427] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042544] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042655] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ff2ef5e9-f543-4592-9896-e2c75369a971 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.042973] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 610.067583] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.092601] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.104740] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.115306] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a378ea2-b981-443a-a925-9819ac5b979f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.127542] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 72e8a8e0-6229-4557-889b-73851a13dbc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.137093] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 586d9ca2-c287-49bc-bf61-a5140ceaddea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.147993] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 72225dcc-d210-49cf-8395-056bf4f7f652 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.158278] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7d8b4524-f867-4605-a468-a1c39e77dabd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.170211] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 59c6c887-b973-4a03-ba64-4ad12be45f64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.181331] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b154a29f-564d-449d-8321-8ded1d4ec29b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.191814] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 330f33a1-cc70-4346-b6c5-e26720ed72f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.202289] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b24a1521-7fcb-4369-ab8e-211690508a67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.213260] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 130cd4e5-9df4-4428-abaa-d937d73d9950 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.223446] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c80cde37-2b69-46f7-9ca6-bfa618a18b1e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.237366] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7e3c6624-3c29-4df0-b417-575976b2f0f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.252020] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.268688] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a48f97b9-8720-4ad2-82a3-bae679b6b2ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.302813] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8eb675b6-f73d-47a0-af75-63b5a8800e69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.319371] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance fdd51c42-e0c1-4ad1-aee9-f091cfec0030 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.331921] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 207825f2-e4aa-4747-bd79-384773b3d516 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.349398] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 57919d5b-769c-4752-873c-d78bb61f5800 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 610.349716] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 610.349828] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 610.728593] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc269163-4c5d-483b-925b-5411163bf0ac {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.737052] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a027cd8c-1144-4b8e-96d7-c09fa20d25b7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.767585] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae48bb39-9ef4-4de6-a5d4-9c79ea3bdb55 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.775568] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f21303-04d5-4bb7-8efd-b04c74f3e43d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.789402] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 610.798492] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 610.811842] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 610.811842] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.851s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 611.260603] env[60764]: DEBUG oslo_service.periodic_task [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 611.260923] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 611.261031] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 611.261143] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 611.281317] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.281461] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.281593] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.281713] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.281831] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.281953] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.282087] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.282218] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.282337] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.282491] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 611.282613] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 611.283112] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 611.283299] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 639.857132] env[60764]: WARNING oslo_vmware.rw_handles [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 639.857132] env[60764]: ERROR oslo_vmware.rw_handles [ 639.857630] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 
{{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 639.859144] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 639.859433] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Copying Virtual Disk [datastore2] vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/fbce997e-56fe-4ae2-96f7-3dbfd94a77cf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 639.859735] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3b5a886b-38c7-4cfe-bf3d-65ab851ea256 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 639.867613] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Waiting for the task: (returnval){ [ 639.867613] env[60764]: value = "task-2204868" [ 639.867613] env[60764]: _type = "Task" [ 639.867613] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 639.875825] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Task: {'id': task-2204868, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.377433] env[60764]: DEBUG oslo_vmware.exceptions [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 640.377712] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.378266] env[60764]: ERROR nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 640.378266] env[60764]: Faults: ['InvalidArgument'] [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Traceback (most recent call last): [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] yield resources [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self.driver.spawn(context, instance, image_meta, [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self._fetch_image_if_missing(context, vi) [ 640.378266] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] image_cache(vi, tmp_image_ds_loc) [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] vm_util.copy_virtual_disk( [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] session._wait_for_task(vmdk_copy_task) [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] return self.wait_for_task(task_ref) [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] return evt.wait() [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] result = hub.switch() [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 640.378567] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] return self.greenlet.switch() [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self.f(*self.args, **self.kw) [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] raise exceptions.translate_fault(task_info.error) [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Faults: ['InvalidArgument'] [ 640.378928] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] [ 640.378928] env[60764]: INFO nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Terminating instance [ 640.380130] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.380329] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 640.380650] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1914e45d-62a1-496f-993a-d6c1e90d7d96 {{(pid=60764) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.382797] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 640.382985] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 640.383732] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a420ed1b-c078-4f10-a10c-101675f236f4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.390487] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 640.390683] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-03b8ea91-364c-4e7f-87b5-48aaaebd32eb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.392846] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 640.393025] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 640.393980] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-45989978-53ab-4bc5-b462-99835e3e1f26 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.398533] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Waiting for the task: (returnval){ [ 640.398533] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52997299-6446-c69e-6998-3ad110505df0" [ 640.398533] env[60764]: _type = "Task" [ 640.398533] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.407358] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52997299-6446-c69e-6998-3ad110505df0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.464604] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 640.464809] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 640.465023] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Deleting the datastore file [datastore2] e60a6397-30ad-48cb-ab52-7ae977615dc3 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 640.465346] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-52601dae-df17-4103-a8e0-55f2ab238e5c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.471666] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Waiting for the task: (returnval){ [ 640.471666] env[60764]: value = "task-2204870" [ 640.471666] env[60764]: _type = "Task" [ 640.471666] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.478943] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Task: {'id': task-2204870, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.909337] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 640.909623] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Creating directory with path [datastore2] vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 640.909899] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-60eb1a5e-73a4-4c16-977d-8cfbc13b56dc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.922246] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Created directory with path [datastore2] vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 640.922486] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Fetch image to [datastore2] vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 640.922642] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 640.923427] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e04d760a-320d-4f3e-b189-5d8db3472b53 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.931056] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10c1a1dc-4546-486b-a5a2-2ca4a05f5155 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.940072] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-576bae2e-8514-4e1c-aa73-164a0be3c076 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.976367] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f492f8a-355f-4b9c-bf86-3b0c5c7411d4 {{(pid=60764) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.983944] env[60764]: DEBUG oslo_vmware.api [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Task: {'id': task-2204870, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074811} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 640.985467] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 640.985656] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 640.985840] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 640.986052] env[60764]: INFO nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 640.988147] env[60764]: DEBUG nova.compute.claims [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 640.988323] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.988794] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.992228] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ee6335da-b1bc-4ee5-ba2f-942b89c1a244 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.013076] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 641.071749] env[60764]: DEBUG oslo_vmware.rw_handles [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 641.135248] env[60764]: DEBUG oslo_vmware.rw_handles [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 641.135439] env[60764]: DEBUG oslo_vmware.rw_handles [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 641.466505] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bf24995-3a2e-42b1-b538-966772484c2f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.474744] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f5f4c8f-2d6b-460d-b243-635e8004235d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.504369] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41709ca4-7d81-4a47-81c3-215717529ef2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.511515] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-121c7e86-36b7-473b-9761-9a8e7e00b1d4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.524387] env[60764]: DEBUG nova.compute.provider_tree [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 641.533929] env[60764]: DEBUG nova.scheduler.client.report [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 641.548675] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.560s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 641.549216] env[60764]: ERROR nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 641.549216] env[60764]: Faults: ['InvalidArgument'] [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Traceback (most recent call last): [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self.driver.spawn(context, instance, image_meta, [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self._fetch_image_if_missing(context, vi) [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] image_cache(vi, tmp_image_ds_loc) [ 641.549216] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] vm_util.copy_virtual_disk( [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] session._wait_for_task(vmdk_copy_task) [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] return self.wait_for_task(task_ref) [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] return evt.wait() [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] result = hub.switch() [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] return self.greenlet.switch() [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 641.549867] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] self.f(*self.args, **self.kw) [ 641.550148] env[60764]: ERROR nova.compute.manager [instance: 
e60a6397-30ad-48cb-ab52-7ae977615dc3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 641.550148] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] raise exceptions.translate_fault(task_info.error) [ 641.550148] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 641.550148] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Faults: ['InvalidArgument'] [ 641.550148] env[60764]: ERROR nova.compute.manager [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] [ 641.550148] env[60764]: DEBUG nova.compute.utils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 641.551352] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Build of instance e60a6397-30ad-48cb-ab52-7ae977615dc3 was re-scheduled: A specified parameter was not correct: fileType [ 641.551352] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 641.551713] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 641.551885] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 641.552068] env[60764]: DEBUG nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 641.552232] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 641.949995] env[60764]: DEBUG nova.network.neutron [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.971466] env[60764]: INFO nova.compute.manager [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] [instance: e60a6397-30ad-48cb-ab52-7ae977615dc3] Took 0.42 seconds to deallocate network for instance. [ 642.088543] env[60764]: INFO nova.scheduler.client.report [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Deleted allocations for instance e60a6397-30ad-48cb-ab52-7ae977615dc3 [ 642.110741] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5516ef92-1a08-4871-836e-40123aeb6fde tempest-FloatingIPsAssociationTestJSON-1962276649 tempest-FloatingIPsAssociationTestJSON-1962276649-project-member] Lock "e60a6397-30ad-48cb-ab52-7ae977615dc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 105.965s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 642.139324] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 642.189248] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 642.189497] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 642.190985] env[60764]: INFO nova.compute.claims [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 642.610796] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3786cd50-7d75-43fa-a1bb-68e60d544a28 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.618522] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39bb0a6a-44ed-4926-a3ac-c648353cde08 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.648128] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3f84429-9f47-43cb-9b54-90c27350054c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.655427] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb1abf1d-1bc2-420a-b6d6-6505acac97bd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.670747] env[60764]: DEBUG nova.compute.provider_tree [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 642.677497] env[60764]: DEBUG nova.scheduler.client.report [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 642.695299] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 
tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.506s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 642.695800] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 642.728758] env[60764]: DEBUG nova.compute.utils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 642.730475] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 642.730647] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 642.741290] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 642.802641] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 642.818568] env[60764]: DEBUG nova.policy [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'edce62c549c545b6a021c6e1400106d8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4be0d93cd01348b5ae03891c7f8fe1a4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 642.837088] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 642.837376] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 642.837558] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 642.837744] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 642.837891] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 642.838099] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 642.838320] env[60764]: DEBUG nova.virt.hardware [None 
req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 642.838476] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 642.838654] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 642.838827] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 642.838998] env[60764]: DEBUG nova.virt.hardware [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 642.840119] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a571c1d-b023-4a17-bb61-856edadbf052 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.848106] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a032b98-71bb-49d0-abae-2643914f0e22 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.294924] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Successfully created port: 11abfa60-26da-419a-8517-89060ad37c34 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 644.058029] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Successfully updated port: 11abfa60-26da-419a-8517-89060ad37c34 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 644.072623] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "refresh_cache-4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 644.072769] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 
tempest-MigrationsAdminTest-1153113698-project-member] Acquired lock "refresh_cache-4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 644.072917] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 644.118268] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 644.305332] env[60764]: DEBUG nova.compute.manager [req-838a7a0e-3d4d-4802-9f34-9acdc4e1036d req-17bcb219-bae7-4979-be4f-dc2311ceaf1e service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Received event network-vif-plugged-11abfa60-26da-419a-8517-89060ad37c34 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 644.305582] env[60764]: DEBUG oslo_concurrency.lockutils [req-838a7a0e-3d4d-4802-9f34-9acdc4e1036d req-17bcb219-bae7-4979-be4f-dc2311ceaf1e service nova] Acquiring lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.305816] env[60764]: DEBUG oslo_concurrency.lockutils [req-838a7a0e-3d4d-4802-9f34-9acdc4e1036d req-17bcb219-bae7-4979-be4f-dc2311ceaf1e service nova] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.306045] env[60764]: DEBUG oslo_concurrency.lockutils [req-838a7a0e-3d4d-4802-9f34-9acdc4e1036d req-17bcb219-bae7-4979-be4f-dc2311ceaf1e service nova] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 644.306672] env[60764]: DEBUG nova.compute.manager [req-838a7a0e-3d4d-4802-9f34-9acdc4e1036d req-17bcb219-bae7-4979-be4f-dc2311ceaf1e service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] No waiting events found dispatching network-vif-plugged-11abfa60-26da-419a-8517-89060ad37c34 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 644.306672] env[60764]: WARNING nova.compute.manager [req-838a7a0e-3d4d-4802-9f34-9acdc4e1036d req-17bcb219-bae7-4979-be4f-dc2311ceaf1e service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Received unexpected event network-vif-plugged-11abfa60-26da-419a-8517-89060ad37c34 for instance with vm_state building and task_state spawning. 
[ 644.361060] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Updating instance_info_cache with network_info: [{"id": "11abfa60-26da-419a-8517-89060ad37c34", "address": "fa:16:3e:48:f3:24", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11abfa60-26", "ovs_interfaceid": "11abfa60-26da-419a-8517-89060ad37c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 644.375726] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Releasing lock "refresh_cache-4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 644.375726] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance network_info: |[{"id": "11abfa60-26da-419a-8517-89060ad37c34", "address": "fa:16:3e:48:f3:24", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11abfa60-26", "ovs_interfaceid": "11abfa60-26da-419a-8517-89060ad37c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 644.376164] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 
tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:f3:24', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '11abfa60-26da-419a-8517-89060ad37c34', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 644.383493] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Creating folder: Project (4be0d93cd01348b5ae03891c7f8fe1a4). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 644.384132] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fc3b3354-0e5a-4526-b7cf-b401dfb85e44 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.395847] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Created folder: Project (4be0d93cd01348b5ae03891c7f8fe1a4) in parent group-v449629. [ 644.396058] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Creating folder: Instances. Parent ref: group-v449667. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 644.396305] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4a5fa7f9-6b7b-4daf-a719-1d6e8a7cab46 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.405315] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Created folder: Instances in parent group-v449667. [ 644.405550] env[60764]: DEBUG oslo.service.loopingcall [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 644.405733] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 644.405928] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6d535e4c-016d-42df-8d89-5941e8a0769d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.426104] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 644.426104] env[60764]: value = "task-2204873" [ 644.426104] env[60764]: _type = "Task" [ 644.426104] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 644.435133] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204873, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 644.939009] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204873, 'name': CreateVM_Task, 'duration_secs': 0.303573} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 644.939238] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 644.939874] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 644.940048] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 644.940352] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 644.940594] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e872a61b-7603-4126-9955-ce12fafd6915 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.944810] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Waiting for the task: (returnval){ [ 644.944810] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52bc1783-c992-aa28-aed1-735e7cf8f383" [ 644.944810] env[60764]: _type = "Task" [ 644.944810] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 644.952170] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52bc1783-c992-aa28-aed1-735e7cf8f383, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 645.454789] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 645.455072] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 645.455294] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 646.377484] env[60764]: DEBUG nova.compute.manager [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Received event network-changed-11abfa60-26da-419a-8517-89060ad37c34 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 646.377645] env[60764]: DEBUG nova.compute.manager [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Refreshing instance network info cache due to event network-changed-11abfa60-26da-419a-8517-89060ad37c34. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 646.377856] env[60764]: DEBUG oslo_concurrency.lockutils [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] Acquiring lock "refresh_cache-4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 646.378017] env[60764]: DEBUG oslo_concurrency.lockutils [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] Acquired lock "refresh_cache-4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 646.378170] env[60764]: DEBUG nova.network.neutron [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Refreshing network info cache for port 11abfa60-26da-419a-8517-89060ad37c34 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 646.948082] env[60764]: DEBUG nova.network.neutron [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Updated VIF entry in instance network info cache for port 11abfa60-26da-419a-8517-89060ad37c34. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 646.948439] env[60764]: DEBUG nova.network.neutron [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Updating instance_info_cache with network_info: [{"id": "11abfa60-26da-419a-8517-89060ad37c34", "address": "fa:16:3e:48:f3:24", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.187", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11abfa60-26", "ovs_interfaceid": "11abfa60-26da-419a-8517-89060ad37c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 646.961294] env[60764]: DEBUG oslo_concurrency.lockutils [req-410d7372-d8d4-43e2-be25-886efc61abba req-b3df88d3-a1b2-4ebf-875d-282ccaf1d187 service nova] Releasing lock "refresh_cache-4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 669.333938] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 669.334328] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 669.350182] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.350524] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.350524] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 669.350801] env[60764]: 
DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 669.351905] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b34a1658-cd67-4a6e-8e41-6b8e5c0adc07 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.359098] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquiring lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.359325] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.364827] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68544380-d412-48f3-a14d-1b1a3f3c8a94 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.381800] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f0316c1-cc40-4648-944f-2bfc0247522f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.390326] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17e6d0fe-40ca-4870-8205-7bffcd4635fd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.421635] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181251MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 669.421816] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.421988] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.498598] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 
2f530484-d828-4b65-a81e-c1c1a84ec903 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.498802] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 673185d0-9e2c-49dc-8323-f8b30a65b59d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.498929] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499131] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bea83327-9479-46b2-bd78-c81d72359e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499575] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499575] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499575] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499694] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ff2ef5e9-f543-4592-9896-e2c75369a971 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.499875] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 669.511323] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.522056] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.532817] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a378ea2-b981-443a-a925-9819ac5b979f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.543316] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 72e8a8e0-6229-4557-889b-73851a13dbc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.554009] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 586d9ca2-c287-49bc-bf61-a5140ceaddea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.563828] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 72225dcc-d210-49cf-8395-056bf4f7f652 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.573590] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7d8b4524-f867-4605-a468-a1c39e77dabd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.588727] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 59c6c887-b973-4a03-ba64-4ad12be45f64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.599424] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b154a29f-564d-449d-8321-8ded1d4ec29b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.609619] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 330f33a1-cc70-4346-b6c5-e26720ed72f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.619063] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b24a1521-7fcb-4369-ab8e-211690508a67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.632664] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 130cd4e5-9df4-4428-abaa-d937d73d9950 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.642725] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c80cde37-2b69-46f7-9ca6-bfa618a18b1e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.655934] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7e3c6624-3c29-4df0-b417-575976b2f0f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.664556] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.676631] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a48f97b9-8720-4ad2-82a3-bae679b6b2ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.691252] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8eb675b6-f73d-47a0-af75-63b5a8800e69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.700902] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance fdd51c42-e0c1-4ad1-aee9-f091cfec0030 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.710896] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 207825f2-e4aa-4747-bd79-384773b3d516 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.723538] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 57919d5b-769c-4752-873c-d78bb61f5800 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.733728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 669.733975] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 669.734144] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 670.073061] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d57e70d1-d557-40b1-97c2-0a771479dc4d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.080969] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0363bf85-9304-4715-9eec-904aeb1280d9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.111166] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ed825af-dfc9-4549-a355-c35a99fe097f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.119072] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56a6e87d-fba2-4cb6-94d7-110a5e7cbd6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.132354] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 670.140795] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 670.158256] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 670.158491] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.736s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 671.153987] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.325453] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.329086] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.329240] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 671.329363] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 671.352340] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.352498] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.352633] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.352758] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.352876] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.353008] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.353133] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.353251] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.353369] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.353559] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 671.353680] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 671.354183] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.354355] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.354513] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.354645] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 672.330784] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 687.828607] env[60764]: WARNING oslo_vmware.rw_handles [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 687.828607] env[60764]: ERROR oslo_vmware.rw_handles [ 687.829325] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 687.830703] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 687.830958] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Copying Virtual Disk [datastore2] vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/db6a5fc1-865d-443c-b0d9-1f6ba0a51850/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 687.831263] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5c956396-adad-49d2-bcf0-4ad646d3c96d {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 687.840082] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Waiting for the task: (returnval){ [ 687.840082] env[60764]: value = "task-2204884" [ 687.840082] env[60764]: _type = "Task" [ 687.840082] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 687.849823] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Task: {'id': task-2204884, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 688.351240] env[60764]: DEBUG oslo_vmware.exceptions [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 688.351536] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 688.352114] env[60764]: ERROR nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 688.352114] env[60764]: Faults: ['InvalidArgument'] [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Traceback (most recent call last): [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] yield resources [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self.driver.spawn(context, instance, image_meta, [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self._vmops.spawn(context, instance, image_meta, injected_files, [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 688.352114] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self._fetch_image_if_missing(context, vi) [ 688.352114] 
env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] image_cache(vi, tmp_image_ds_loc) [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] vm_util.copy_virtual_disk( [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] session._wait_for_task(vmdk_copy_task) [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] return self.wait_for_task(task_ref) [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] return evt.wait() [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] result = hub.switch() [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 688.352586] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] return self.greenlet.switch() [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self.f(*self.args, **self.kw) [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] raise exceptions.translate_fault(task_info.error) [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Faults: ['InvalidArgument'] [ 688.352951] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] [ 688.352951] env[60764]: INFO nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 
tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Terminating instance [ 688.353965] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 688.354188] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 688.354427] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8579604-fdd7-4c6b-85f1-3b0321c736bb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.356756] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 688.356972] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 688.357713] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e5e27b9-771b-4f66-be15-899bd0f9af6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.364726] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 688.364949] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ccdcc641-1b4d-48ae-8706-4ca2e4fe068b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.367168] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 688.367338] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 688.368296] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9712ece3-7ef5-471d-9fb3-3652a67084ed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.372891] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Waiting for the task: (returnval){ [ 688.372891] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]528c2695-9f8c-571e-22dc-8bd15e378b2e" [ 688.372891] env[60764]: _type = "Task" [ 688.372891] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 688.380097] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]528c2695-9f8c-571e-22dc-8bd15e378b2e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 688.435531] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 688.435732] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 688.435954] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Deleting the datastore file [datastore2] 2f530484-d828-4b65-a81e-c1c1a84ec903 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 688.436734] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af51feba-f1db-43d9-9d27-497a8fa14653 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.443149] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Waiting for the task: (returnval){ [ 688.443149] env[60764]: value = "task-2204886" [ 688.443149] env[60764]: _type = "Task" [ 688.443149] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 688.450072] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Task: {'id': task-2204886, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 688.883361] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 688.883630] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Creating directory with path [datastore2] vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 688.883854] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f66b0a9d-4f80-4afa-8ff3-6784a881a13d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.897189] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Created directory with path [datastore2] vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 688.897189] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Fetch image to [datastore2] vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 688.897189] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 688.897189] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea7daadc-87e9-4a66-b55f-d14e94a1a42e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.902650] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-894e0bca-5455-4af1-8d8f-80e8ca8a9b26 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.911715] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-335d522a-d66e-45ba-acfb-acb32ef9d48f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.941945] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2bfe7f29-45f7-40d1-b39b-31cf1fc5b068 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.962013] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4e3c49a0-7502-483b-9b8d-880fe39d8a50 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 688.964176] env[60764]: DEBUG oslo_vmware.api [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Task: {'id': task-2204886, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06422} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 688.964470] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 688.964616] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 688.964784] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 688.964946] env[60764]: INFO nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 688.967704] env[60764]: DEBUG nova.compute.claims [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 688.967844] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 688.968070] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 688.985648] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 689.038503] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 689.102897] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 689.102897] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 689.417654] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f16b2a-3123-4c1b-b3e7-a759b86dcb1f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.425287] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d477a62f-63df-496e-bfb6-527fa5585033 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.453978] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceaeb201-4df9-4125-8bec-532636fdc5bd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.460828] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ec6c4eb-2f30-4d9e-a0bd-f6ffaa69eb75 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 689.473473] env[60764]: DEBUG nova.compute.provider_tree [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 689.481956] env[60764]: DEBUG nova.scheduler.client.report [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 689.499589] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.531s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 689.500267] env[60764]: ERROR nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 689.500267] env[60764]: Faults: ['InvalidArgument'] [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Traceback (most recent call last): [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 689.500267] env[60764]: ERROR nova.compute.manager 
[instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self.driver.spawn(context, instance, image_meta, [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self._vmops.spawn(context, instance, image_meta, injected_files, [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self._fetch_image_if_missing(context, vi) [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] image_cache(vi, tmp_image_ds_loc) [ 689.500267] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] vm_util.copy_virtual_disk( [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] session._wait_for_task(vmdk_copy_task) [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] return self.wait_for_task(task_ref) [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] return evt.wait() [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] result = hub.switch() [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] return self.greenlet.switch() [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 689.500584] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] self.f(*self.args, **self.kw) [ 689.500919] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 689.500919] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] raise exceptions.translate_fault(task_info.error) [ 689.500919] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 689.500919] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Faults: ['InvalidArgument'] [ 689.500919] env[60764]: ERROR nova.compute.manager [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] [ 689.500919] env[60764]: DEBUG nova.compute.utils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 689.502220] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Build of instance 2f530484-d828-4b65-a81e-c1c1a84ec903 was re-scheduled: A specified parameter was not correct: fileType [ 689.502220] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 689.502591] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 689.502759] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 689.502911] env[60764]: DEBUG nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 689.503086] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 689.844404] env[60764]: DEBUG nova.network.neutron [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 689.862266] env[60764]: INFO nova.compute.manager [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] [instance: 2f530484-d828-4b65-a81e-c1c1a84ec903] Took 0.35 seconds to deallocate network for instance. [ 689.968648] env[60764]: INFO nova.scheduler.client.report [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Deleted allocations for instance 2f530484-d828-4b65-a81e-c1c1a84ec903 [ 689.993572] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cfe43569-eb51-46c0-8870-d4dfd330de4b tempest-ServerDiagnosticsTest-868391637 tempest-ServerDiagnosticsTest-868391637-project-member] Lock "2f530484-d828-4b65-a81e-c1c1a84ec903" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 152.231s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 690.008933] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 690.066915] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 690.067191] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 690.069069] env[60764]: INFO nova.compute.claims [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 690.497366] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce0cfd12-6271-4870-8b0c-f38755b943a7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.505786] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38a9b68b-bc18-483c-b7b9-10a1e5238192 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.535317] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c46325-a2f7-4dd6-b5c4-184addc217e4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.542436] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f16add1-3b8c-434e-b9db-9891a5fbe4e1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.555307] env[60764]: DEBUG nova.compute.provider_tree [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.564227] env[60764]: DEBUG nova.scheduler.client.report [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.580557] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.513s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 690.581074] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 690.617810] env[60764]: DEBUG nova.compute.utils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 690.619363] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 690.619363] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 690.629031] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 690.691231] env[60764]: DEBUG nova.policy [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea494b5112cd4f3fab123a5fd644e440', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e4290efc4a24771a2bd3a21f3b431a8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 690.696824] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 690.723669] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 690.723952] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 690.724124] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 690.724311] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 690.724456] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 
tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 690.724603] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 690.724811] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 690.724967] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 690.725144] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 690.725303] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 690.725469] env[60764]: DEBUG nova.virt.hardware [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 690.726359] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f97ae5c0-05f2-49b8-8b79-dd9889375486 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 690.734176] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c81f9a5b-3894-40eb-87c4-261dfc57c307 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.109380] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Successfully created port: 2774a55a-402a-432a-a7ca-b5914ae4d93b {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 692.186308] env[60764]: DEBUG nova.network.neutron [None 
req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Successfully updated port: 2774a55a-402a-432a-a7ca-b5914ae4d93b {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 692.201201] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "refresh_cache-437e0c0d-6d0e-4465-9651-14e420b646ae" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 692.201251] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquired lock "refresh_cache-437e0c0d-6d0e-4465-9651-14e420b646ae" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 692.201449] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.245814] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 692.297555] env[60764]: DEBUG nova.compute.manager [req-ac55abd5-62fc-4030-bfe0-a530919bdc09 req-96afb370-870e-4701-a6c4-6b9101ced1eb service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Received event network-vif-plugged-2774a55a-402a-432a-a7ca-b5914ae4d93b {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 692.297555] env[60764]: DEBUG oslo_concurrency.lockutils [req-ac55abd5-62fc-4030-bfe0-a530919bdc09 req-96afb370-870e-4701-a6c4-6b9101ced1eb service nova] Acquiring lock "437e0c0d-6d0e-4465-9651-14e420b646ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 692.297728] env[60764]: DEBUG oslo_concurrency.lockutils [req-ac55abd5-62fc-4030-bfe0-a530919bdc09 req-96afb370-870e-4701-a6c4-6b9101ced1eb service nova] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 692.298085] env[60764]: DEBUG oslo_concurrency.lockutils [req-ac55abd5-62fc-4030-bfe0-a530919bdc09 req-96afb370-870e-4701-a6c4-6b9101ced1eb service nova] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 692.298085] env[60764]: DEBUG nova.compute.manager [req-ac55abd5-62fc-4030-bfe0-a530919bdc09 req-96afb370-870e-4701-a6c4-6b9101ced1eb service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] No waiting events found dispatching network-vif-plugged-2774a55a-402a-432a-a7ca-b5914ae4d93b {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 692.298721] env[60764]: WARNING nova.compute.manager [req-ac55abd5-62fc-4030-bfe0-a530919bdc09 req-96afb370-870e-4701-a6c4-6b9101ced1eb service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Received unexpected event network-vif-plugged-2774a55a-402a-432a-a7ca-b5914ae4d93b for instance with vm_state building and task_state spawning. 
[ 692.500253] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Updating instance_info_cache with network_info: [{"id": "2774a55a-402a-432a-a7ca-b5914ae4d93b", "address": "fa:16:3e:9c:6f:80", "network": {"id": "6121c79a-dea2-457d-a2c5-70521898dc9c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1849405885-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6e4290efc4a24771a2bd3a21f3b431a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4e02e98f-44ce-42b7-a3ac-4034fae5d127", "external-id": "nsx-vlan-transportzone-874", "segmentation_id": 874, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2774a55a-40", "ovs_interfaceid": "2774a55a-402a-432a-a7ca-b5914ae4d93b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 692.513179] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Releasing lock "refresh_cache-437e0c0d-6d0e-4465-9651-14e420b646ae" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 692.513428] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance network_info: |[{"id": "2774a55a-402a-432a-a7ca-b5914ae4d93b", "address": "fa:16:3e:9c:6f:80", "network": {"id": "6121c79a-dea2-457d-a2c5-70521898dc9c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1849405885-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6e4290efc4a24771a2bd3a21f3b431a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4e02e98f-44ce-42b7-a3ac-4034fae5d127", "external-id": "nsx-vlan-transportzone-874", "segmentation_id": 874, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2774a55a-40", "ovs_interfaceid": "2774a55a-402a-432a-a7ca-b5914ae4d93b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 692.513823] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9c:6f:80', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4e02e98f-44ce-42b7-a3ac-4034fae5d127', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2774a55a-402a-432a-a7ca-b5914ae4d93b', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 692.521660] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Creating folder: Project (6e4290efc4a24771a2bd3a21f3b431a8). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 692.522790] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f0de22d3-3014-4734-9b88-42be8121f99e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.534073] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Created folder: Project (6e4290efc4a24771a2bd3a21f3b431a8) in parent group-v449629. [ 692.534284] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Creating folder: Instances. Parent ref: group-v449674. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 692.534517] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1bd0316d-5ae2-48ac-b1d1-c4286f46870b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.543903] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Created folder: Instances in parent group-v449674. [ 692.544158] env[60764]: DEBUG oslo.service.loopingcall [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 692.544340] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 692.544540] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7985381b-b918-41f0-8ea6-77bd139d828e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.563879] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 692.563879] env[60764]: value = "task-2204889" [ 692.563879] env[60764]: _type = "Task" [ 692.563879] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 692.571602] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204889, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 693.074140] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204889, 'name': CreateVM_Task, 'duration_secs': 0.381474} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 693.074914] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 693.075132] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 693.075132] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 693.075416] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 693.075651] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-09953e7a-cd7c-4e99-aeeb-4032ab0d7546 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 693.080214] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Waiting for the task: (returnval){ [ 693.080214] env[60764]: value = 
"session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52470444-bdd5-b409-458e-58edd4b3017b" [ 693.080214] env[60764]: _type = "Task" [ 693.080214] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 693.087958] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52470444-bdd5-b409-458e-58edd4b3017b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 693.592173] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 693.592529] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 693.592940] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 693.666347] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 693.666669] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 694.335134] env[60764]: DEBUG nova.compute.manager [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Received event network-changed-2774a55a-402a-432a-a7ca-b5914ae4d93b {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 694.335300] env[60764]: DEBUG nova.compute.manager [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service 
nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Refreshing instance network info cache due to event network-changed-2774a55a-402a-432a-a7ca-b5914ae4d93b. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 694.335452] env[60764]: DEBUG oslo_concurrency.lockutils [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] Acquiring lock "refresh_cache-437e0c0d-6d0e-4465-9651-14e420b646ae" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 694.335759] env[60764]: DEBUG oslo_concurrency.lockutils [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] Acquired lock "refresh_cache-437e0c0d-6d0e-4465-9651-14e420b646ae" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 694.335759] env[60764]: DEBUG nova.network.neutron [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Refreshing network info cache for port 2774a55a-402a-432a-a7ca-b5914ae4d93b {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 694.623826] env[60764]: DEBUG nova.network.neutron [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Updated VIF entry in instance network info cache for port 2774a55a-402a-432a-a7ca-b5914ae4d93b. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 694.625082] env[60764]: DEBUG nova.network.neutron [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Updating instance_info_cache with network_info: [{"id": "2774a55a-402a-432a-a7ca-b5914ae4d93b", "address": "fa:16:3e:9c:6f:80", "network": {"id": "6121c79a-dea2-457d-a2c5-70521898dc9c", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1849405885-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6e4290efc4a24771a2bd3a21f3b431a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4e02e98f-44ce-42b7-a3ac-4034fae5d127", "external-id": "nsx-vlan-transportzone-874", "segmentation_id": 874, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2774a55a-40", "ovs_interfaceid": "2774a55a-402a-432a-a7ca-b5914ae4d93b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.638027] env[60764]: DEBUG oslo_concurrency.lockutils [req-68c934a3-14c9-409a-a420-8c0448d80d85 req-d1b68f22-9d69-4e70-a7dd-9c5fc15ad34d service nova] Releasing lock "refresh_cache-437e0c0d-6d0e-4465-9651-14e420b646ae" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 729.330621] env[60764]: DEBUG 
oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 730.330129] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 730.342489] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.342740] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.342874] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 730.343044] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 730.344170] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1900cdad-56ae-42d9-946c-51c4f221cf1e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.352985] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5578725-c312-47fa-b982-8da8f22f33ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.366691] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67cf0545-423b-4c43-9ce8-7385e66c2b03 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.373147] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b6be52b-3386-4284-a619-31e133b7eca3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.402913] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181275MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 730.403188] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.403280] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.476036] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 673185d0-9e2c-49dc-8323-f8b30a65b59d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476036] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476036] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bea83327-9479-46b2-bd78-c81d72359e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476036] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476255] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476255] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476358] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ff2ef5e9-f543-4592-9896-e2c75369a971 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476476] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476608] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.476738] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 730.489265] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.498829] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a378ea2-b981-443a-a925-9819ac5b979f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.507791] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 72e8a8e0-6229-4557-889b-73851a13dbc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.516728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 586d9ca2-c287-49bc-bf61-a5140ceaddea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.525785] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 72225dcc-d210-49cf-8395-056bf4f7f652 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.534843] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7d8b4524-f867-4605-a468-a1c39e77dabd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.544085] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 59c6c887-b973-4a03-ba64-4ad12be45f64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.552974] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b154a29f-564d-449d-8321-8ded1d4ec29b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.561658] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 330f33a1-cc70-4346-b6c5-e26720ed72f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.570404] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b24a1521-7fcb-4369-ab8e-211690508a67 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.581087] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 130cd4e5-9df4-4428-abaa-d937d73d9950 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.593885] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c80cde37-2b69-46f7-9ca6-bfa618a18b1e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.603603] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7e3c6624-3c29-4df0-b417-575976b2f0f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.613863] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.624089] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a48f97b9-8720-4ad2-82a3-bae679b6b2ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.633656] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8eb675b6-f73d-47a0-af75-63b5a8800e69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.646997] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance fdd51c42-e0c1-4ad1-aee9-f091cfec0030 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.657338] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 207825f2-e4aa-4747-bd79-384773b3d516 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.666962] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 57919d5b-769c-4752-873c-d78bb61f5800 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.682128] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.693271] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 730.693502] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 730.693650] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 731.061119] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bed24d8-59d5-400b-a597-deef02803839 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.069363] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5595ab3c-b37b-4b99-9d8c-1f32fc99cf5f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.098490] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5d842f5-701c-4934-8085-989bb6e1ccba {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.105565] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-751c3547-2f46-4abc-8d0f-a98c734d2fc8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.118315] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 
67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.126701] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.142824] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 731.143024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.740s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.138326] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.166273] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.166702] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 732.166702] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 732.187545] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.187712] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.187847] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.187971] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.188280] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.188435] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.188560] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.188711] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.188879] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.189031] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 732.189159] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 732.189605] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.331020] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.331020] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.331020] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 733.330315] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 733.330590] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 733.330633] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 737.238930] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "673185d0-9e2c-49dc-8323-f8b30a65b59d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 737.847064] env[60764]: WARNING oslo_vmware.rw_handles [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 737.847064] env[60764]: ERROR oslo_vmware.rw_handles [ 737.847064] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 737.850207] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 737.850207] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Copying Virtual Disk [datastore2] vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/101e66c1-0160-4990-ac61-3f219e377cfc/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 737.850207] env[60764]: DEBUG 
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-419ab152-06f6-4b48-828f-fa061850ec7d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 737.858607] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Waiting for the task: (returnval){ [ 737.858607] env[60764]: value = "task-2204890" [ 737.858607] env[60764]: _type = "Task" [ 737.858607] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 737.867293] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Task: {'id': task-2204890, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 738.298484] env[60764]: DEBUG oslo_concurrency.lockutils [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 738.368534] env[60764]: DEBUG oslo_vmware.exceptions [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 738.368851] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 738.369407] env[60764]: ERROR nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 738.369407] env[60764]: Faults: ['InvalidArgument'] [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Traceback (most recent call last): [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] yield resources [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self.driver.spawn(context, instance, image_meta, [ 738.369407] env[60764]: 
ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self._fetch_image_if_missing(context, vi) [ 738.369407] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] image_cache(vi, tmp_image_ds_loc) [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] vm_util.copy_virtual_disk( [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] session._wait_for_task(vmdk_copy_task) [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] return self.wait_for_task(task_ref) [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] return evt.wait() [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] result = hub.switch() [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 738.369745] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] return self.greenlet.switch() [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self.f(*self.args, **self.kw) [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 
673185d0-9e2c-49dc-8323-f8b30a65b59d] raise exceptions.translate_fault(task_info.error) [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Faults: ['InvalidArgument'] [ 738.370320] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] [ 738.370320] env[60764]: INFO nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Terminating instance [ 738.371238] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 738.371434] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 738.371673] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-92a7b55f-c14b-4500-aaae-578ff5d8e909 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.373855] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 738.374062] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 738.374789] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01b95de9-9179-4840-a26f-145a8edc8c48 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.381566] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 738.381793] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0372782e-1426-4e07-a7d0-8e7a921af483 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.384025] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 738.384203] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 738.385148] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99e9ce68-de86-4ee0-9b80-ab7e2edf9a13 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.390022] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Waiting for the task: (returnval){ [ 738.390022] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ab3c7f-f37e-df18-cf0d-c0c9e9069602" [ 738.390022] env[60764]: _type = "Task" [ 738.390022] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 738.397251] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ab3c7f-f37e-df18-cf0d-c0c9e9069602, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 738.446485] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 738.447340] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 738.447340] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Deleting the datastore file [datastore2] 673185d0-9e2c-49dc-8323-f8b30a65b59d {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 738.447340] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2fadd752-170b-4193-9d7c-7b5ff11e6411 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.453731] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Waiting for the task: (returnval){ [ 738.453731] env[60764]: value = "task-2204892" [ 738.453731] env[60764]: _type = "Task" [ 738.453731] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 738.461591] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Task: {'id': task-2204892, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 738.900536] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 738.900796] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Creating directory with path [datastore2] vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 738.901041] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18599b16-9a8b-472a-9d34-cf06530270dd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.914013] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Created directory with path [datastore2] vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 738.914013] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Fetch image to [datastore2] vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 738.914013] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 738.914013] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82405750-8afb-43b4-a3e1-d2d470a2210a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.920305] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803bac67-6f6d-4636-94f1-ffcc3e47c841 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.929397] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2894165d-be5e-401c-845d-e49a07abd2c1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.964819] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1ba20c56-72e0-4575-85e3-826d8b6d3d91 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.972335] env[60764]: DEBUG oslo_vmware.api [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Task: {'id': task-2204892, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.091618} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 738.973848] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 738.974046] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 738.974224] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 738.974392] env[60764]: INFO nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Took 0.60 seconds to destroy the instance on the hypervisor. 
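The records above show oslo.vmware's task handling end to end: the driver submits a vCenter operation (SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task), then wait_for_task/_poll_task repeatedly read the task state, log "progress is N%", and finally report completion with a duration_secs value or raise the translated fault. The sketch below is an illustration of that polling loop only; get_task_info, the terminal-state constants and the poll interval are hypothetical stand-ins, not the oslo.vmware implementation (which runs the same loop inside an eventlet looping call, as the tracebacks elsewhere in this log show).

    import time

    class TaskFailed(Exception):
        """Raised when the polled task ends in an error state."""

    # Hypothetical terminal states; stand-ins for the vim TaskInfo states
    # that oslo.vmware inspects internally.
    SUCCESS = "success"
    ERROR = "error"

    def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
        """Poll a task handle until it reaches a terminal state.

        get_task_info is an assumed callable returning an object with
        .state, .progress and .error attributes for the given task ref.
        """
        while True:
            info = get_task_info(task_ref)
            if info.state == SUCCESS:
                return info          # e.g. carries duration_secs on success
            if info.state == ERROR:
                # mirrors "raise exceptions.translate_fault(task_info.error)"
                raise TaskFailed(info.error)
            # corresponds to the "... progress is 0%." debug lines above
            print("Task %s: progress is %s%%" % (task_ref, info.progress))
            time.sleep(poll_interval)

In the real driver the loop is driven by a looping call under eventlet rather than time.sleep, but the control flow is the same: poll, log progress, and convert a terminal error into an exception that the compute manager then handles.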
[ 738.976184] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-171341d9-9b40-4ccd-8651-33f374fa072b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 738.979240] env[60764]: DEBUG nova.compute.claims [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 738.979240] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 738.979240] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 738.999951] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 739.056850] env[60764]: DEBUG oslo_vmware.rw_handles [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 739.119636] env[60764]: DEBUG oslo_vmware.rw_handles [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 739.119636] env[60764]: DEBUG oslo_vmware.rw_handles [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 739.433506] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57639f68-13a9-4db9-b4ad-7852c123839f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.441195] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e18e827b-f2ab-498a-a4ff-d8fcfa82c4fb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.470883] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6d575ed-36d2-4f04-bfb4-f9b6fceb7793 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.477863] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-627ccfc9-2978-4993-9e3a-61fea07a8863 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 739.491539] env[60764]: DEBUG nova.compute.provider_tree [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.500539] env[60764]: DEBUG nova.scheduler.client.report [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.519603] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.541s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 739.520177] env[60764]: ERROR nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.520177] env[60764]: Faults: ['InvalidArgument'] [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Traceback (most recent call last): [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 739.520177] env[60764]: ERROR 
nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self.driver.spawn(context, instance, image_meta, [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self._fetch_image_if_missing(context, vi) [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] image_cache(vi, tmp_image_ds_loc) [ 739.520177] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] vm_util.copy_virtual_disk( [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] session._wait_for_task(vmdk_copy_task) [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] return self.wait_for_task(task_ref) [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] return evt.wait() [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] result = hub.switch() [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] return self.greenlet.switch() [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 739.520510] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] self.f(*self.args, **self.kw) [ 739.520815] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 739.520815] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] raise exceptions.translate_fault(task_info.error) [ 739.520815] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.520815] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Faults: ['InvalidArgument'] [ 739.520815] env[60764]: ERROR nova.compute.manager [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] [ 739.520931] env[60764]: DEBUG nova.compute.utils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 739.522315] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Build of instance 673185d0-9e2c-49dc-8323-f8b30a65b59d was re-scheduled: A specified parameter was not correct: fileType [ 739.522315] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 739.522694] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 739.522863] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 739.523042] env[60764]: DEBUG nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 739.523208] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 739.900363] env[60764]: DEBUG nova.network.neutron [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.912738] env[60764]: INFO nova.compute.manager [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Took 0.39 seconds to deallocate network for instance. [ 740.012935] env[60764]: INFO nova.scheduler.client.report [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Deleted allocations for instance 673185d0-9e2c-49dc-8323-f8b30a65b59d [ 740.034930] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c6bb456-724b-4520-b720-efe62622b62d tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.637s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.035242] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 2.797s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 740.035457] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Acquiring lock "673185d0-9e2c-49dc-8323-f8b30a65b59d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 740.035673] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 740.035873] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.038685] env[60764]: INFO nova.compute.manager [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Terminating instance [ 740.040563] env[60764]: DEBUG nova.compute.manager [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 740.040910] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 740.041020] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0aba43ba-68a2-4023-a69d-067d3c1794c6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.046083] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 740.052712] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-500692a1-1966-401d-b465-14044a0ebb80 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.082327] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 673185d0-9e2c-49dc-8323-f8b30a65b59d could not be found. 
[ 740.082327] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 740.082327] env[60764]: INFO nova.compute.manager [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 740.082327] env[60764]: DEBUG oslo.service.loopingcall [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 740.082327] env[60764]: DEBUG nova.compute.manager [-] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 740.082539] env[60764]: DEBUG nova.network.neutron [-] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 740.107887] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 740.108145] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 740.109749] env[60764]: INFO nova.compute.claims [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 740.119743] env[60764]: DEBUG nova.network.neutron [-] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.129938] env[60764]: INFO nova.compute.manager [-] [instance: 673185d0-9e2c-49dc-8323-f8b30a65b59d] Took 0.05 seconds to deallocate network for instance. 
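Annotation (not part of the captured log): the "_deallocate_network_with_retries" looping-call entry above wraps network deallocation in a retry loop. A minimal sketch of that pattern, using only the standard library rather than oslo.service, is shown below; the attempt count and interval are illustrative assumptions.

    # Minimal stdlib sketch of "deallocate with retries"; values are illustrative.
    import logging
    import time

    LOG = logging.getLogger(__name__)

    def deallocate_network_with_retries(deallocate, attempts=3, interval=1.0):
        """Call deallocate() until it succeeds or attempts are exhausted."""
        for attempt in range(1, attempts + 1):
            try:
                started = time.monotonic()
                deallocate()
                LOG.info("Took %.2f seconds to deallocate network for instance.",
                         time.monotonic() - started)
                return
            except Exception:
                LOG.exception("Deallocation attempt %d/%d failed",
                              attempt, attempts)
                if attempt == attempts:
                    raise
                time.sleep(interval)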
[ 740.254564] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0dcd7c9-3ef7-4365-8294-806b9c659fc8 tempest-ImagesOneServerTestJSON-1357980772 tempest-ImagesOneServerTestJSON-1357980772-project-member] Lock "673185d0-9e2c-49dc-8323-f8b30a65b59d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.219s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.431410] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "bea83327-9479-46b2-bd78-c81d72359e8a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 740.554464] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35faca94-f09d-4595-8244-919c4e6d0ce9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.562431] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c140d7ef-0c2b-4974-a31b-276489985a6f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.592729] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b7e109b-459b-44a1-bbd9-a8b79246be18 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.599500] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c276225-239d-429a-9567-fb18213e6b1f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.612994] env[60764]: DEBUG nova.compute.provider_tree [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.620561] env[60764]: DEBUG nova.scheduler.client.report [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.638102] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.530s {{(pid=60764) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.638577] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 740.680072] env[60764]: DEBUG nova.compute.utils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 740.680072] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 740.680072] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 740.690173] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 740.773212] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 740.803430] env[60764]: DEBUG nova.policy [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d3b71e7238d402495718b45ca34b574', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4098f4f456f84ec59b7996e94368b83f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 740.809454] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:24:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1094491305',id=21,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1998069283',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 740.809676] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 740.809851] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 740.810066] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 740.810214] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 740.810364] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Chose sockets=0, cores=0, 
threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 740.810562] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 740.810714] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 740.810875] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 740.811368] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 740.811581] env[60764]: DEBUG nova.virt.hardware [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 740.812733] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eedd914-bc50-4ca6-b080-c017c7b44a7e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 740.829093] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-441af45a-3367-41c0-ad88-ee2ec8386704 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 741.684319] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Successfully created port: 1f5c2817-f590-4f3a-83fa-92eced8925ad {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 742.775532] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Successfully updated port: 1f5c2817-f590-4f3a-83fa-92eced8925ad {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 742.778744] env[60764]: DEBUG nova.compute.manager [req-f18a8362-eb79-4871-8f6a-146e6098169a req-c7e4d15c-3e3e-43a4-b124-a4500a922d80 service nova] [instance: 
1f11c625-166f-4609-badf-da4dd9475c37] Received event network-vif-plugged-1f5c2817-f590-4f3a-83fa-92eced8925ad {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 742.779019] env[60764]: DEBUG oslo_concurrency.lockutils [req-f18a8362-eb79-4871-8f6a-146e6098169a req-c7e4d15c-3e3e-43a4-b124-a4500a922d80 service nova] Acquiring lock "1f11c625-166f-4609-badf-da4dd9475c37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.779242] env[60764]: DEBUG oslo_concurrency.lockutils [req-f18a8362-eb79-4871-8f6a-146e6098169a req-c7e4d15c-3e3e-43a4-b124-a4500a922d80 service nova] Lock "1f11c625-166f-4609-badf-da4dd9475c37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.779428] env[60764]: DEBUG oslo_concurrency.lockutils [req-f18a8362-eb79-4871-8f6a-146e6098169a req-c7e4d15c-3e3e-43a4-b124-a4500a922d80 service nova] Lock "1f11c625-166f-4609-badf-da4dd9475c37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 742.779531] env[60764]: DEBUG nova.compute.manager [req-f18a8362-eb79-4871-8f6a-146e6098169a req-c7e4d15c-3e3e-43a4-b124-a4500a922d80 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] No waiting events found dispatching network-vif-plugged-1f5c2817-f590-4f3a-83fa-92eced8925ad {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 742.779684] env[60764]: WARNING nova.compute.manager [req-f18a8362-eb79-4871-8f6a-146e6098169a req-c7e4d15c-3e3e-43a4-b124-a4500a922d80 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Received unexpected event network-vif-plugged-1f5c2817-f590-4f3a-83fa-92eced8925ad for instance with vm_state building and task_state spawning. 
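Annotation (not part of the captured log): the entries above show the external-event flow: a network-vif-plugged event arrives, the per-instance "-events" lock is taken, pop_instance_event finds no registered waiter, and the event is logged as unexpected because the instance is still building. The sketch below re-implements that prepare/pop pattern with threading primitives; the class and function names are hypothetical, not Nova's actual classes.

    # Hedged sketch of the prepare/pop external-event pattern seen above.
    import logging
    import threading

    LOG = logging.getLogger(__name__)

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # stands in for the "-events" lock
            self._events = {}               # (instance_uuid, name) -> Event

        def prepare_for_event(self, instance_uuid, name):
            event = threading.Event()
            with self._lock:
                self._events[(instance_uuid, name)] = event
            return event

        def pop_instance_event(self, instance_uuid, name):
            with self._lock:
                return self._events.pop((instance_uuid, name), None)

    def external_instance_event(events, instance_uuid, name):
        waiter = events.pop_instance_event(instance_uuid, name)
        if waiter is None:
            LOG.warning("Received unexpected event %s for instance %s",
                        name, instance_uuid)
        else:
            waiter.set()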
[ 742.790259] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "refresh_cache-1f11c625-166f-4609-badf-da4dd9475c37" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 742.790259] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquired lock "refresh_cache-1f11c625-166f-4609-badf-da4dd9475c37" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 742.790259] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 743.036659] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.260097] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Updating instance_info_cache with network_info: [{"id": "1f5c2817-f590-4f3a-83fa-92eced8925ad", "address": "fa:16:3e:d7:e9:40", "network": {"id": "a4fffae0-cae6-439e-8d0c-db524a90d423", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1712634310-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4098f4f456f84ec59b7996e94368b83f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1cbd5e0e-9116-46f1-9748-13a73d2d7e75", "external-id": "nsx-vlan-transportzone-690", "segmentation_id": 690, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f5c2817-f5", "ovs_interfaceid": "1f5c2817-f590-4f3a-83fa-92eced8925ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.273680] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Releasing lock "refresh_cache-1f11c625-166f-4609-badf-da4dd9475c37" {{(pid=60764) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 743.273986] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance network_info: |[{"id": "1f5c2817-f590-4f3a-83fa-92eced8925ad", "address": "fa:16:3e:d7:e9:40", "network": {"id": "a4fffae0-cae6-439e-8d0c-db524a90d423", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1712634310-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4098f4f456f84ec59b7996e94368b83f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1cbd5e0e-9116-46f1-9748-13a73d2d7e75", "external-id": "nsx-vlan-transportzone-690", "segmentation_id": 690, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f5c2817-f5", "ovs_interfaceid": "1f5c2817-f590-4f3a-83fa-92eced8925ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 743.274426] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d7:e9:40', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1cbd5e0e-9116-46f1-9748-13a73d2d7e75', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1f5c2817-f590-4f3a-83fa-92eced8925ad', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 743.283204] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Creating folder: Project (4098f4f456f84ec59b7996e94368b83f). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 743.283796] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6f45e91e-6c04-4c2f-9397-45c921d1acd3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.297613] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Created folder: Project (4098f4f456f84ec59b7996e94368b83f) in parent group-v449629. 
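Annotation (not part of the captured log): the "Instance VIF info" entry above is derived from the neutron network_info shown a few entries earlier (bridge br-int, MAC fa:16:3e:d7:e9:40, NSX logical-switch id, port id, vmxnet3 adapter). The following is a simplified, assumed mapping from one network_info entry to that VIF info dict, written for illustration only.

    # Illustrative mapping (assumption: simplified) from a neutron network_info
    # entry, as logged above, to the VIF info dict logged by the vmwareapi driver
    # for an NSX logical-switch backed port with a vmxnet3 adapter.
    def vif_info_from_network_info(vif, vif_model='vmxnet3'):
        details = vif.get('details', {})
        return {
            'network_name': vif['network']['bridge'],   # e.g. 'br-int'
            'mac_address': vif['address'],              # e.g. 'fa:16:3e:d7:e9:40'
            'network_ref': {
                'type': 'OpaqueNetwork',
                'network-id': details.get('nsx-logical-switch-id'),
                'network-type': 'nsx.LogicalSwitch',
                'use-external-id': True,
            },
            'iface_id': vif['id'],
            'vif_model': vif_model,
        }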
[ 743.297790] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Creating folder: Instances. Parent ref: group-v449677. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 743.298085] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1813105-dd75-4bbc-af38-3b948302b414 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.307829] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Created folder: Instances in parent group-v449677. [ 743.307829] env[60764]: DEBUG oslo.service.loopingcall [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 743.308211] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 743.308211] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c7e2bf16-91fb-4bab-8d42-c30da5d45e06 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 743.330283] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 743.330283] env[60764]: value = "task-2204895" [ 743.330283] env[60764]: _type = "Task" [ 743.330283] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 743.337586] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204895, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 743.838980] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204895, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 744.340882] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204895, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 744.842220] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204895, 'name': CreateVM_Task, 'duration_secs': 1.41032} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 744.842537] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 744.843184] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.843400] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 744.843761] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 744.844098] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-908277f6-92af-4306-b94c-707c73763f9f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 744.848583] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Waiting for the task: (returnval){ [ 744.848583] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52d667ff-a0d1-bb05-8b20-bb450ae1c80a" [ 744.848583] env[60764]: _type = "Task" [ 744.848583] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 744.862200] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52d667ff-a0d1-bb05-8b20-bb450ae1c80a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 744.990942] env[60764]: DEBUG nova.compute.manager [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Received event network-changed-1f5c2817-f590-4f3a-83fa-92eced8925ad {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 744.990942] env[60764]: DEBUG nova.compute.manager [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Refreshing instance network info cache due to event network-changed-1f5c2817-f590-4f3a-83fa-92eced8925ad. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 744.990942] env[60764]: DEBUG oslo_concurrency.lockutils [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] Acquiring lock "refresh_cache-1f11c625-166f-4609-badf-da4dd9475c37" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 744.990942] env[60764]: DEBUG oslo_concurrency.lockutils [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] Acquired lock "refresh_cache-1f11c625-166f-4609-badf-da4dd9475c37" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 744.992253] env[60764]: DEBUG nova.network.neutron [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Refreshing network info cache for port 1f5c2817-f590-4f3a-83fa-92eced8925ad {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 745.365458] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 745.365740] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 745.365951] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 745.598797] env[60764]: DEBUG nova.network.neutron [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Updated VIF entry in instance network info cache for port 1f5c2817-f590-4f3a-83fa-92eced8925ad. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 745.598797] env[60764]: DEBUG nova.network.neutron [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Updating instance_info_cache with network_info: [{"id": "1f5c2817-f590-4f3a-83fa-92eced8925ad", "address": "fa:16:3e:d7:e9:40", "network": {"id": "a4fffae0-cae6-439e-8d0c-db524a90d423", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1712634310-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4098f4f456f84ec59b7996e94368b83f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1cbd5e0e-9116-46f1-9748-13a73d2d7e75", "external-id": "nsx-vlan-transportzone-690", "segmentation_id": 690, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1f5c2817-f5", "ovs_interfaceid": "1f5c2817-f590-4f3a-83fa-92eced8925ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.608428] env[60764]: DEBUG oslo_concurrency.lockutils [req-664c3da6-48ed-4e5a-b84b-e9782ae03120 req-cd093908-c0c7-45df-a0c2-6c9512dea1c4 service nova] Releasing lock "refresh_cache-1f11c625-166f-4609-badf-da4dd9475c37" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 749.763122] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "74b4bba7-8568-4fc4-a744-395a3271abc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 749.763412] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 750.564614] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.275445] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] 
Acquiring lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.683985] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "4a0b0a82-3910-4201-b1f7-34c862667e3c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.748338] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "ff2ef5e9-f543-4592-9896-e2c75369a971" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 756.597870] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "b846d9ae-759a-4898-9ede-091819325701" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 758.421389] env[60764]: DEBUG oslo_concurrency.lockutils [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.371666] env[60764]: DEBUG oslo_concurrency.lockutils [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "437e0c0d-6d0e-4465-9651-14e420b646ae" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 759.887049] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "1f11c625-166f-4609-badf-da4dd9475c37" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 766.537625] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "dfd3e3af-90c9-420b-81ec-e9115c519016" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 766.537625] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 778.359724] env[60764]: DEBUG oslo_concurrency.lockutils [None req-46c1d467-41fe-43a8-b217-524ee57d9c5b tempest-ServersV294TestFqdnHostnames-693022799 tempest-ServersV294TestFqdnHostnames-693022799-project-member] Acquiring lock "79528e3a-72e2-4d7e-913d-bb42d757fe64" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.360069] env[60764]: DEBUG oslo_concurrency.lockutils [None req-46c1d467-41fe-43a8-b217-524ee57d9c5b tempest-ServersV294TestFqdnHostnames-693022799 tempest-ServersV294TestFqdnHostnames-693022799-project-member] Lock "79528e3a-72e2-4d7e-913d-bb42d757fe64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 780.507796] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c204e2a0-dd3b-488c-9457-66cb66b13916 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "24977d06-906a-4000-9b8f-262085258c6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 780.508102] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c204e2a0-dd3b-488c-9457-66cb66b13916 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "24977d06-906a-4000-9b8f-262085258c6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 782.035819] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e3384b1-c280-4797-aea9-6d13f5ce9222 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "15d15d22-4ffa-43a1-ab5a-506637d1a3cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.036146] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e3384b1-c280-4797-aea9-6d13f5ce9222 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "15d15d22-4ffa-43a1-ab5a-506637d1a3cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 787.530752] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 787.530752] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 788.229254] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b7f9121-01c4-4ce3-9ed7-ea23702d8d86 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "7cef2173-9a2d-4428-81d2-f13b807967c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.229254] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b7f9121-01c4-4ce3-9ed7-ea23702d8d86 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "7cef2173-9a2d-4428-81d2-f13b807967c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 788.920025] env[60764]: WARNING oslo_vmware.rw_handles [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 788.920025] env[60764]: ERROR oslo_vmware.rw_handles [ 788.920706] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to 
vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 788.922525] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 788.922818] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Copying Virtual Disk [datastore2] vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/def4fe8c-3c43-4cbd-8c85-0732ae70ef99/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 788.923174] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e9422d66-879e-4153-8939-7b61534c5b0a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 788.932666] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Waiting for the task: (returnval){ [ 788.932666] env[60764]: value = "task-2204896" [ 788.932666] env[60764]: _type = "Task" [ 788.932666] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 788.946026] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Task: {'id': task-2204896, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 789.331489] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 789.331575] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 789.354229] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] There are 0 instances to clean {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 789.354447] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 789.354558] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances with incomplete migration {{(pid=60764) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 789.370501] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 789.445057] env[60764]: DEBUG oslo_vmware.exceptions [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 789.445546] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 789.446154] env[60764]: ERROR nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 789.446154] env[60764]: Faults: ['InvalidArgument'] [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Traceback (most recent call last): [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] yield resources [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self.driver.spawn(context, instance, image_meta, [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self._fetch_image_if_missing(context, vi) [ 789.446154] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] image_cache(vi, tmp_image_ds_loc) [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] vm_util.copy_virtual_disk( [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] session._wait_for_task(vmdk_copy_task) [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] return self.wait_for_task(task_ref) [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] return evt.wait() [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] result = hub.switch() [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 789.446572] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] return self.greenlet.switch() [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self.f(*self.args, **self.kw) [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] raise exceptions.translate_fault(task_info.error) [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Faults: ['InvalidArgument'] [ 789.446935] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] [ 789.446935] env[60764]: INFO nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Terminating instance [ 789.449161] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 789.449325] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 789.450527] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b586435d-37f8-489f-b2f6-8df0d84e4875 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.457031] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 789.457031] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 789.457786] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1e1b969-da14-4126-91c3-82053131760e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.469356] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 789.471552] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6551e4b3-940d-4290-8ee2-b268214c86cd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.471714] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 789.471921] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 789.472744] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-973d74a0-927b-4bd0-8c1a-50b450734459 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.478432] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for the task: (returnval){ [ 789.478432] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ae01ff-476f-d46d-ad94-da3d01e7ed24" [ 789.478432] env[60764]: _type = "Task" [ 789.478432] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 789.486482] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ae01ff-476f-d46d-ad94-da3d01e7ed24, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 789.543233] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 789.546147] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 789.546499] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Deleting the datastore file [datastore2] 2696525a-3366-45a1-b413-8e4e0bd9d6c6 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 789.546798] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-22269f4d-6806-4f86-9f35-ceaba7786422 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 789.553623] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Waiting for the task: (returnval){ [ 789.553623] env[60764]: value = "task-2204898" [ 789.553623] env[60764]: _type = "Task" [ 789.553623] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 789.562495] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Task: {'id': task-2204898, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 789.629147] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "e80dd396-f709-48d7-bc98-159b175f5593" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 789.629405] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "e80dd396-f709-48d7-bc98-159b175f5593" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 789.990894] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 789.991202] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Creating directory with path [datastore2] vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 789.991452] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1414e540-dbfe-4859-bb24-de4ad4ec83be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.003567] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Created directory with path [datastore2] vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 790.003656] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Fetch image to [datastore2] vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 790.003781] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 790.004915] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b656f96b-c5e3-4eb9-8b97-db0ce5a90ad2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.012509] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3f62400-308f-4a12-a17e-d448f066041e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.021492] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-452e6001-5c6f-4822-863a-442b4a616df3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.059325] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c2f2c13-59e0-445b-8807-d7608cb487ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.066038] env[60764]: DEBUG oslo_vmware.api [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Task: {'id': task-2204898, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07788} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 790.067087] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 790.067290] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 790.067471] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 790.067640] env[60764]: INFO nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 790.069513] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d56e9222-6609-4bd8-9be4-c09e2051a512 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.071780] env[60764]: DEBUG nova.compute.claims [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 790.071928] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 790.072171] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 790.106029] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 790.186727] env[60764]: DEBUG oslo_vmware.rw_handles [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 790.258686] env[60764]: DEBUG oslo_vmware.rw_handles [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 790.258686] env[60764]: DEBUG oslo_vmware.rw_handles [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 790.734309] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54ede603-1c01-4352-b3df-fa7961333cee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.745436] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c163bfcc-364f-4c61-b7e2-324f5b027fd1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.780364] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ddb673-9163-4796-8344-934815fe5d8d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.786418] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42c0a36d-6a71-431d-9d16-b1842689d7e0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 790.803809] env[60764]: DEBUG nova.compute.provider_tree [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 790.815590] env[60764]: DEBUG nova.scheduler.client.report [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 790.849794] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.777s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 790.850400] env[60764]: ERROR nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 790.850400] env[60764]: Faults: ['InvalidArgument'] [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Traceback (most recent call last): [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 790.850400] env[60764]: 
ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self.driver.spawn(context, instance, image_meta, [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self._fetch_image_if_missing(context, vi) [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] image_cache(vi, tmp_image_ds_loc) [ 790.850400] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] vm_util.copy_virtual_disk( [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] session._wait_for_task(vmdk_copy_task) [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] return self.wait_for_task(task_ref) [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] return evt.wait() [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] result = hub.switch() [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] return self.greenlet.switch() [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 790.851046] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] self.f(*self.args, **self.kw) [ 790.851726] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 790.851726] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] raise exceptions.translate_fault(task_info.error) [ 790.851726] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 790.851726] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Faults: ['InvalidArgument'] [ 790.851726] env[60764]: ERROR nova.compute.manager [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] [ 790.851726] env[60764]: DEBUG nova.compute.utils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 790.853327] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Build of instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 was re-scheduled: A specified parameter was not correct: fileType [ 790.853327] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 790.853451] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 790.853594] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 790.853742] env[60764]: DEBUG nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 790.853897] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 791.378800] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 791.379103] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 791.379869] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 791.406114] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.406314] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.406497] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.406649] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.406980] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.406980] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.407105] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.407206] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.410036] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 791.410036] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 791.410036] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 791.410036] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 791.410036] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 791.424552] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.424772] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.424930] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 791.425094] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 791.426412] 
env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eedaa23-69ce-4666-88ad-f3c6fc46ff7f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.437022] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f99eb615-3b48-407a-8329-e8645c52efd5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.450647] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17c425a4-678f-49d6-a0f3-77f94832031a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.457630] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ba37e8-03f9-4dc8-8962-e42ebb93a99a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.497832] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181281MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 791.497995] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.498220] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.583111] env[60764]: DEBUG nova.network.neutron [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 791.599337] env[60764]: INFO nova.compute.manager [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Took 0.75 seconds to deallocate network for instance. [ 791.615022] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.615022] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bea83327-9479-46b2-bd78-c81d72359e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615022] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615022] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615320] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615320] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ff2ef5e9-f543-4592-9896-e2c75369a971 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615320] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615320] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615510] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.615510] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 791.628032] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7e3c6624-3c29-4df0-b417-575976b2f0f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.656935] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.675678] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a48f97b9-8720-4ad2-82a3-bae679b6b2ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.688275] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8eb675b6-f73d-47a0-af75-63b5a8800e69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.710013] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance fdd51c42-e0c1-4ad1-aee9-f091cfec0030 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.722367] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 207825f2-e4aa-4747-bd79-384773b3d516 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.738901] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 57919d5b-769c-4752-873c-d78bb61f5800 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.752419] env[60764]: INFO nova.scheduler.client.report [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Deleted allocations for instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 [ 791.758119] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.773677] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.785043] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.819634] env[60764]: DEBUG oslo_concurrency.lockutils [None req-de22c013-7f8c-4169-afeb-6b1a845beeba tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 251.564s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 791.820915] env[60764]: DEBUG oslo_concurrency.lockutils [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 53.523s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.821157] env[60764]: DEBUG oslo_concurrency.lockutils [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Acquiring lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.821357] env[60764]: DEBUG oslo_concurrency.lockutils [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.821513] env[60764]: DEBUG oslo_concurrency.lockutils [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 791.826059] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.827578] env[60764]: INFO nova.compute.manager [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Terminating instance [ 791.830583] env[60764]: DEBUG nova.compute.manager [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 791.830812] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 791.831398] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f7d21259-9658-47dc-a4d2-ed1e30bf4884 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.835540] env[60764]: DEBUG nova.compute.manager [None req-40a3af0c-0c78-47a5-b816-503530f06d87 tempest-VolumesAssistedSnapshotsTest-1009492998 tempest-VolumesAssistedSnapshotsTest-1009492998-project-member] [instance: 7a378ea2-b981-443a-a925-9819ac5b979f] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 791.840544] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 79528e3a-72e2-4d7e-913d-bb42d757fe64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.848078] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3e9a1a-dc15-45b4-a71c-dcbabb93cf53 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 791.860996] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 15d15d22-4ffa-43a1-ab5a-506637d1a3cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.883102] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6 could not be found. [ 791.884072] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 791.884072] env[60764]: INFO nova.compute.manager [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Took 0.05 seconds to destroy the instance on the hypervisor. 
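The records above trace the terminate path for instance 2696525a-3366-45a1-b413-8e4e0bd9d6c6: the per-instance lock is taken, the vmwareapi driver is asked to destroy the VM, the backend raises nova.exception.InstanceNotFound, and the flow logs a warning, still records "Instance destroyed", and moves on to network deallocation. A minimal sketch of that idempotent-destroy pattern follows; the backend object, helper names and log callable are illustrative assumptions, not Nova's actual code.

    class InstanceNotFound(Exception):
        """Raised by the (hypothetical) backend when the VM does not exist."""


    def destroy_instance(backend, instance_uuid, log):
        """Destroy an instance, treating 'already gone' as success.

        Mirrors the log above: a missing VM produces a warning, not an
        error, so teardown (network deallocation, allocation cleanup)
        still runs afterwards.
        """
        try:
            backend.destroy(instance_uuid)  # hypothetical driver call
        except InstanceNotFound:
            log(f"WARNING: instance {instance_uuid} does not exist on backend")
        log(f"DEBUG: instance {instance_uuid} destroyed")
        # caller proceeds to deallocate networking and placement allocations


    if __name__ == "__main__":
        class _StubBackend:
            def destroy(self, uuid):
                raise InstanceNotFound(uuid)

        destroy_instance(_StubBackend(),
                         "2696525a-3366-45a1-b413-8e4e0bd9d6c6", print)

Running the stub reproduces the warn-then-continue behaviour seen in the log, which is what makes a repeated delete request harmless.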
[ 791.884072] env[60764]: DEBUG oslo.service.loopingcall [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 791.884241] env[60764]: DEBUG nova.compute.manager [None req-40a3af0c-0c78-47a5-b816-503530f06d87 tempest-VolumesAssistedSnapshotsTest-1009492998 tempest-VolumesAssistedSnapshotsTest-1009492998-project-member] [instance: 7a378ea2-b981-443a-a925-9819ac5b979f] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 791.885528] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d2496c8d-c17c-4178-a2a9-85390aa0bb21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.886749] env[60764]: DEBUG nova.compute.manager [-] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 791.886796] env[60764]: DEBUG nova.network.neutron [-] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 791.899530] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7cef2173-9a2d-4428-81d2-f13b807967c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.917732] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 791.917979] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 791.918144] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 791.928480] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40a3af0c-0c78-47a5-b816-503530f06d87 tempest-VolumesAssistedSnapshotsTest-1009492998 tempest-VolumesAssistedSnapshotsTest-1009492998-project-member] Lock "7a378ea2-b981-443a-a925-9819ac5b979f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.763s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 791.941873] env[60764]: DEBUG nova.network.neutron [-] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 791.951308] env[60764]: DEBUG nova.compute.manager [None req-027ed7af-4498-4459-884b-21717a6ec8ef tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: 72e8a8e0-6229-4557-889b-73851a13dbc1] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 791.959303] env[60764]: INFO nova.compute.manager [-] [instance: 2696525a-3366-45a1-b413-8e4e0bd9d6c6] Took 0.07 seconds to deallocate network for instance. [ 791.984886] env[60764]: DEBUG nova.compute.manager [None req-027ed7af-4498-4459-884b-21717a6ec8ef tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: 72e8a8e0-6229-4557-889b-73851a13dbc1] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.023728] env[60764]: DEBUG oslo_concurrency.lockutils [None req-027ed7af-4498-4459-884b-21717a6ec8ef tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "72e8a8e0-6229-4557-889b-73851a13dbc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.172s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.041105] env[60764]: DEBUG nova.compute.manager [None req-0a95c3d4-f4c1-4317-8f29-d690850754f8 tempest-ImagesNegativeTestJSON-1879878930 tempest-ImagesNegativeTestJSON-1879878930-project-member] [instance: 586d9ca2-c287-49bc-bf61-a5140ceaddea] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.071253] env[60764]: DEBUG nova.compute.manager [None req-0a95c3d4-f4c1-4317-8f29-d690850754f8 tempest-ImagesNegativeTestJSON-1879878930 tempest-ImagesNegativeTestJSON-1879878930-project-member] [instance: 586d9ca2-c287-49bc-bf61-a5140ceaddea] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.119411] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0a95c3d4-f4c1-4317-8f29-d690850754f8 tempest-ImagesNegativeTestJSON-1879878930 tempest-ImagesNegativeTestJSON-1879878930-project-member] Lock "586d9ca2-c287-49bc-bf61-a5140ceaddea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.584s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.157031] env[60764]: DEBUG nova.compute.manager [None req-b9285dae-455e-4eb2-ae5b-72bd4e60b168 tempest-AttachInterfacesUnderV243Test-1999804320 tempest-AttachInterfacesUnderV243Test-1999804320-project-member] [instance: 72225dcc-d210-49cf-8395-056bf4f7f652] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.181522] env[60764]: DEBUG oslo_concurrency.lockutils [None req-41aa9f18-5c10-4be4-be63-1ff8771fbdd7 tempest-ServerExternalEventsTest-1190521688 tempest-ServerExternalEventsTest-1190521688-project-member] Lock "2696525a-3366-45a1-b413-8e4e0bd9d6c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.360s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.206659] env[60764]: DEBUG nova.compute.manager [None req-b9285dae-455e-4eb2-ae5b-72bd4e60b168 tempest-AttachInterfacesUnderV243Test-1999804320 tempest-AttachInterfacesUnderV243Test-1999804320-project-member] [instance: 72225dcc-d210-49cf-8395-056bf4f7f652] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.241239] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b9285dae-455e-4eb2-ae5b-72bd4e60b168 tempest-AttachInterfacesUnderV243Test-1999804320 tempest-AttachInterfacesUnderV243Test-1999804320-project-member] Lock "72225dcc-d210-49cf-8395-056bf4f7f652" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.466s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.259780] env[60764]: DEBUG nova.compute.manager [None req-25dbeba2-7296-40d9-acb6-05202f9bb1da tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: 7d8b4524-f867-4605-a468-a1c39e77dabd] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.291655] env[60764]: DEBUG nova.compute.manager [None req-25dbeba2-7296-40d9-acb6-05202f9bb1da tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: 7d8b4524-f867-4605-a468-a1c39e77dabd] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.320750] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25dbeba2-7296-40d9-acb6-05202f9bb1da tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "7d8b4524-f867-4605-a468-a1c39e77dabd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.250s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.338138] env[60764]: DEBUG nova.compute.manager [None req-48384ae7-d525-4af3-8a30-c942e3265f51 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 59c6c887-b973-4a03-ba64-4ad12be45f64] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.387724] env[60764]: DEBUG nova.compute.manager [None req-48384ae7-d525-4af3-8a30-c942e3265f51 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 59c6c887-b973-4a03-ba64-4ad12be45f64] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.428626] env[60764]: DEBUG oslo_concurrency.lockutils [None req-48384ae7-d525-4af3-8a30-c942e3265f51 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "59c6c887-b973-4a03-ba64-4ad12be45f64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.919s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.457673] env[60764]: DEBUG nova.compute.manager [None req-fdedc5e8-8f91-4526-a378-611a505be85d tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b154a29f-564d-449d-8321-8ded1d4ec29b] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.466072] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5c02a14-4e42-4f65-b74c-99d6be1c643a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.477521] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c798c70-70f3-4ae6-a3cd-6a8ed5dc6789 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.533856] env[60764]: DEBUG nova.compute.manager [None req-fdedc5e8-8f91-4526-a378-611a505be85d tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b154a29f-564d-449d-8321-8ded1d4ec29b] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.536018] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1533763-48b5-4618-9739-8b717c20588a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.549585] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36cbe566-6216-4e16-85f5-7e53d5cc7516 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 792.572156] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 792.585233] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 792.590430] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fdedc5e8-8f91-4526-a378-611a505be85d tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b154a29f-564d-449d-8321-8ded1d4ec29b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.882s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.611788] env[60764]: DEBUG nova.compute.manager [None req-79ee51a8-a915-41e4-83df-367e0bef2851 tempest-InstanceActionsV221TestJSON-2132129667 tempest-InstanceActionsV221TestJSON-2132129667-project-member] [instance: 330f33a1-cc70-4346-b6c5-e26720ed72f0] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.615785] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 792.615949] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.118s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.643173] env[60764]: DEBUG nova.compute.manager [None req-79ee51a8-a915-41e4-83df-367e0bef2851 tempest-InstanceActionsV221TestJSON-2132129667 tempest-InstanceActionsV221TestJSON-2132129667-project-member] [instance: 330f33a1-cc70-4346-b6c5-e26720ed72f0] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.676248] env[60764]: DEBUG oslo_concurrency.lockutils [None req-79ee51a8-a915-41e4-83df-367e0bef2851 tempest-InstanceActionsV221TestJSON-2132129667 tempest-InstanceActionsV221TestJSON-2132129667-project-member] Lock "330f33a1-cc70-4346-b6c5-e26720ed72f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.315s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.693257] env[60764]: DEBUG nova.compute.manager [None req-bc55a72e-1728-4f37-8111-89c2f67847de tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: b24a1521-7fcb-4369-ab8e-211690508a67] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.724091] env[60764]: DEBUG nova.compute.manager [None req-bc55a72e-1728-4f37-8111-89c2f67847de tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: b24a1521-7fcb-4369-ab8e-211690508a67] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.753663] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bc55a72e-1728-4f37-8111-89c2f67847de tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "b24a1521-7fcb-4369-ab8e-211690508a67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.151s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.765088] env[60764]: DEBUG nova.compute.manager [None req-4e4a1568-531d-44a0-9c82-d6ee4b410c79 tempest-ServersTestManualDisk-764844659 tempest-ServersTestManualDisk-764844659-project-member] [instance: 130cd4e5-9df4-4428-abaa-d937d73d9950] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 792.775883] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Acquiring lock "b2e4096c-0edc-44ac-a4b6-3a32e0466c54" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 792.776147] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Lock "b2e4096c-0edc-44ac-a4b6-3a32e0466c54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 792.799990] env[60764]: DEBUG nova.compute.manager [None req-4e4a1568-531d-44a0-9c82-d6ee4b410c79 tempest-ServersTestManualDisk-764844659 tempest-ServersTestManualDisk-764844659-project-member] [instance: 130cd4e5-9df4-4428-abaa-d937d73d9950] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 792.827964] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Acquiring lock "30aca955-c304-43e8-8da1-91cd7f4e1b38" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 792.828442] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Lock "30aca955-c304-43e8-8da1-91cd7f4e1b38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 792.859136] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Acquiring lock "c5627bb3-36c8-415f-bf4e-449adedd5ba6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 792.859136] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Lock "c5627bb3-36c8-415f-bf4e-449adedd5ba6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 792.972168] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e4a1568-531d-44a0-9c82-d6ee4b410c79 tempest-ServersTestManualDisk-764844659 tempest-ServersTestManualDisk-764844659-project-member] Lock "130cd4e5-9df4-4428-abaa-d937d73d9950" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.652s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 792.986506] env[60764]: DEBUG nova.compute.manager [None req-86d04512-f5cc-4259-83af-39ddeffd2e29 tempest-ServersAaction247Test-1656782549 tempest-ServersAaction247Test-1656782549-project-member] [instance: c80cde37-2b69-46f7-9ca6-bfa618a18b1e] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 793.014835] env[60764]: DEBUG nova.compute.manager [None req-86d04512-f5cc-4259-83af-39ddeffd2e29 tempest-ServersAaction247Test-1656782549 tempest-ServersAaction247Test-1656782549-project-member] [instance: c80cde37-2b69-46f7-9ca6-bfa618a18b1e] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 793.042615] env[60764]: DEBUG oslo_concurrency.lockutils [None req-86d04512-f5cc-4259-83af-39ddeffd2e29 tempest-ServersAaction247Test-1656782549 tempest-ServersAaction247Test-1656782549-project-member] Lock "c80cde37-2b69-46f7-9ca6-bfa618a18b1e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.574s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 793.054142] env[60764]: DEBUG nova.compute.manager [None req-32d25acc-108a-4cee-987b-6f738651cc19 tempest-ServerAddressesTestJSON-1852641836 tempest-ServerAddressesTestJSON-1852641836-project-member] [instance: 7e3c6624-3c29-4df0-b417-575976b2f0f7] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 793.082906] env[60764]: DEBUG nova.compute.manager [None req-32d25acc-108a-4cee-987b-6f738651cc19 tempest-ServerAddressesTestJSON-1852641836 tempest-ServerAddressesTestJSON-1852641836-project-member] [instance: 7e3c6624-3c29-4df0-b417-575976b2f0f7] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 793.106275] env[60764]: DEBUG oslo_concurrency.lockutils [None req-32d25acc-108a-4cee-987b-6f738651cc19 tempest-ServerAddressesTestJSON-1852641836 tempest-ServerAddressesTestJSON-1852641836-project-member] Lock "7e3c6624-3c29-4df0-b417-575976b2f0f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.645s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 793.119441] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 793.190016] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.190016] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.190016] env[60764]: INFO nova.compute.claims [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 793.268445] env[60764]: DEBUG nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Refreshing inventories for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 793.290013] env[60764]: DEBUG nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Updating ProviderTree inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 793.290277] env[60764]: DEBUG nova.compute.provider_tree [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 793.305997] env[60764]: DEBUG nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Refreshing aggregate associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, aggregates: None {{(pid=60764) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:813}} [ 793.325492] env[60764]: DEBUG nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Refreshing trait associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 793.538235] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 793.538508] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 793.538608] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 793.770752] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5704dee8-0af6-40ea-ac8d-3f7b6f22e270 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.784594] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a90bd579-9eb3-463e-a98b-61d18349f781 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.818872] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1e44fcc-bf05-43f5-920f-e275bfbbffeb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.827704] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c4a880-04a1-4141-ad98-4edd89e73c97 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.846786] env[60764]: DEBUG nova.compute.provider_tree [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 793.859032] env[60764]: DEBUG nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 793.875649] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.688s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 793.876180] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 793.922426] env[60764]: DEBUG nova.compute.utils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 793.923700] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 793.923875] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 793.942694] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 794.016983] env[60764]: DEBUG nova.policy [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ca02dd26c1a241a6a607ca2867db299d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fc214d2435e94f6cbf7ca16207861ab3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 794.031581] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 794.063998] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 794.064258] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 794.064411] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 794.064584] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 794.064723] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 794.064865] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 794.065414] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 794.065628] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 794.065810] env[60764]: DEBUG 
nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 794.065978] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 794.066173] env[60764]: DEBUG nova.virt.hardware [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 794.067091] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a167b08d-c130-49ee-b5e4-764d6f538b3f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.075959] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-646cedb4-5621-48a1-85c0-ef35ff1f9d1a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 794.329019] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 794.329439] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 794.329585] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 795.440441] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Successfully created port: 1c79fd0c-a2f5-4377-bc4d-7307a073163e {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 795.789821] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "6af62a04-4a13-46c7-a0b2-28768c789f23" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 797.924275] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Successfully updated port: 1c79fd0c-a2f5-4377-bc4d-7307a073163e {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 797.934293] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 797.934481] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquired lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 797.934594] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 798.028329] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 798.072089] env[60764]: DEBUG nova.compute.manager [req-b28bec65-9727-4fbb-8717-11b2b0035283 req-51bd79fc-0c4f-435a-928a-95221c302543 service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Received event network-vif-plugged-1c79fd0c-a2f5-4377-bc4d-7307a073163e {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 798.072089] env[60764]: DEBUG oslo_concurrency.lockutils [req-b28bec65-9727-4fbb-8717-11b2b0035283 req-51bd79fc-0c4f-435a-928a-95221c302543 service nova] Acquiring lock "6af62a04-4a13-46c7-a0b2-28768c789f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.072089] env[60764]: DEBUG oslo_concurrency.lockutils [req-b28bec65-9727-4fbb-8717-11b2b0035283 req-51bd79fc-0c4f-435a-928a-95221c302543 service nova] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.072089] env[60764]: DEBUG oslo_concurrency.lockutils [req-b28bec65-9727-4fbb-8717-11b2b0035283 req-51bd79fc-0c4f-435a-928a-95221c302543 service nova] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 798.072349] env[60764]: DEBUG nova.compute.manager [req-b28bec65-9727-4fbb-8717-11b2b0035283 req-51bd79fc-0c4f-435a-928a-95221c302543 service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] No waiting events found dispatching network-vif-plugged-1c79fd0c-a2f5-4377-bc4d-7307a073163e {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 798.072349] env[60764]: WARNING nova.compute.manager [req-b28bec65-9727-4fbb-8717-11b2b0035283 req-51bd79fc-0c4f-435a-928a-95221c302543 service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Received unexpected event network-vif-plugged-1c79fd0c-a2f5-4377-bc4d-7307a073163e for instance with vm_state building and task_state deleting. 
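The record above shows the external-event path: a network-vif-plugged event arrives from Neutron, the per-instance "<uuid>-events" lock is taken, a waiting event is popped, none is found, and the event is reported as unexpected (the instance is already in task_state deleting). Below is a minimal sketch of that register-and-pop pattern under the assumption of a simple in-process dict guarded by a lock; class and function names are illustrative, not Nova's.

    import threading
    from collections import defaultdict


    class InstanceEvents:
        """Per-instance external-event registry (illustrative sketch)."""

        def __init__(self):
            self._lock = threading.Lock()
            # instance_uuid -> {event_name: threading.Event}
            self._waiters = defaultdict(dict)

        def prepare_for(self, instance_uuid, event_name):
            """Register interest in an event before starting the operation."""
            waiter = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = waiter
            return waiter

        def pop_event(self, instance_uuid, event_name):
            """Called when an external event arrives; returns the waiter or None."""
            with self._lock:
                return self._waiters[instance_uuid].pop(event_name, None)


    def handle_external_event(events, instance_uuid, event_name, log=print):
        waiter = events.pop_event(instance_uuid, event_name)
        if waiter is None:
            log(f"WARNING: received unexpected event {event_name} "
                f"for instance {instance_uuid}")
        else:
            waiter.set()  # wake whoever is waiting for e.g. network-vif-plugged

Calling handle_external_event on a fresh registry with no prior prepare_for call yields the same "unexpected event" warning as the log, because nothing registered interest in the vif-plugged event before it arrived.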
[ 798.773019] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Updating instance_info_cache with network_info: [{"id": "1c79fd0c-a2f5-4377-bc4d-7307a073163e", "address": "fa:16:3e:93:c7:18", "network": {"id": "ad60a611-6fd0-415d-9a3e-f0e92576ce30", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-2073397685-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fc214d2435e94f6cbf7ca16207861ab3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e2153f70-3d14-42ab-8bb3-be78296dd3b8", "external-id": "nsx-vlan-transportzone-532", "segmentation_id": 532, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c79fd0c-a2", "ovs_interfaceid": "1c79fd0c-a2f5-4377-bc4d-7307a073163e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.790379] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Releasing lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 798.791155] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance network_info: |[{"id": "1c79fd0c-a2f5-4377-bc4d-7307a073163e", "address": "fa:16:3e:93:c7:18", "network": {"id": "ad60a611-6fd0-415d-9a3e-f0e92576ce30", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-2073397685-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fc214d2435e94f6cbf7ca16207861ab3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e2153f70-3d14-42ab-8bb3-be78296dd3b8", "external-id": "nsx-vlan-transportzone-532", "segmentation_id": 532, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c79fd0c-a2", "ovs_interfaceid": "1c79fd0c-a2f5-4377-bc4d-7307a073163e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 798.791659] env[60764]: DEBUG 
nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:93:c7:18', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e2153f70-3d14-42ab-8bb3-be78296dd3b8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1c79fd0c-a2f5-4377-bc4d-7307a073163e', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 798.801881] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Creating folder: Project (fc214d2435e94f6cbf7ca16207861ab3). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 798.803677] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c47de40c-caa5-4354-a9bb-25944c796073 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.820983] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Created folder: Project (fc214d2435e94f6cbf7ca16207861ab3) in parent group-v449629. [ 798.821272] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Creating folder: Instances. Parent ref: group-v449680. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 798.822119] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bb459749-fd71-4637-b124-cbb4e25db8ef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.834023] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Created folder: Instances in parent group-v449680. [ 798.834023] env[60764]: DEBUG oslo.service.loopingcall [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 798.834023] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 798.834023] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c127919b-58ce-47d8-95da-9af708bc573f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.852218] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 798.852218] env[60764]: value = "task-2204901" [ 798.852218] env[60764]: _type = "Task" [ 798.852218] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 798.860550] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204901, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 799.366835] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204901, 'name': CreateVM_Task, 'duration_secs': 0.300384} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 799.367111] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 799.367853] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 799.367853] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 799.368205] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 799.368504] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-18352331-86ad-442c-bfb8-c2feaf8d36fc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.373589] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Waiting for the task: (returnval){ [ 799.373589] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52c1b5e3-ea46-6548-e591-cd6534843fab" [ 799.373589] env[60764]: _type = "Task" [ 799.373589] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 799.382075] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52c1b5e3-ea46-6548-e591-cd6534843fab, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 799.886271] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 799.886686] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 799.887054] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 800.187882] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2538a5b9-d16f-4988-a8ca-38e30f5cd1e0 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] Acquiring lock "5bd8caaf-f61b-4df2-abd0-3da5259ae829" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 800.188622] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2538a5b9-d16f-4988-a8ca-38e30f5cd1e0 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] Lock "5bd8caaf-f61b-4df2-abd0-3da5259ae829" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 801.013053] env[60764]: DEBUG nova.compute.manager [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Received event network-changed-1c79fd0c-a2f5-4377-bc4d-7307a073163e {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 801.013295] env[60764]: DEBUG nova.compute.manager [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Refreshing instance network info cache due to event network-changed-1c79fd0c-a2f5-4377-bc4d-7307a073163e. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 801.013523] env[60764]: DEBUG oslo_concurrency.lockutils [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] Acquiring lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 801.013670] env[60764]: DEBUG oslo_concurrency.lockutils [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] Acquired lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 801.013826] env[60764]: DEBUG nova.network.neutron [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Refreshing network info cache for port 1c79fd0c-a2f5-4377-bc4d-7307a073163e {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 801.050140] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e6da2882-edb7-438f-bc6b-a2a357fc7855 tempest-ServersNegativeTestJSON-1968257794 tempest-ServersNegativeTestJSON-1968257794-project-member] Acquiring lock "9d4328ec-dab2-41c8-88da-82df6f2ae17f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 801.050407] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e6da2882-edb7-438f-bc6b-a2a357fc7855 tempest-ServersNegativeTestJSON-1968257794 tempest-ServersNegativeTestJSON-1968257794-project-member] Lock "9d4328ec-dab2-41c8-88da-82df6f2ae17f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 801.181662] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1da3d1bd-c45c-457f-bd84-36780e49f390 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "12c9b68c-740c-4555-915d-b23c8b5f0473" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 801.182679] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1da3d1bd-c45c-457f-bd84-36780e49f390 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "12c9b68c-740c-4555-915d-b23c8b5f0473" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 801.400755] env[60764]: DEBUG nova.network.neutron [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Updated VIF entry in instance network info cache for port 1c79fd0c-a2f5-4377-bc4d-7307a073163e. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 801.402598] env[60764]: DEBUG nova.network.neutron [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Updating instance_info_cache with network_info: [{"id": "1c79fd0c-a2f5-4377-bc4d-7307a073163e", "address": "fa:16:3e:93:c7:18", "network": {"id": "ad60a611-6fd0-415d-9a3e-f0e92576ce30", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-2073397685-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fc214d2435e94f6cbf7ca16207861ab3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e2153f70-3d14-42ab-8bb3-be78296dd3b8", "external-id": "nsx-vlan-transportzone-532", "segmentation_id": 532, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c79fd0c-a2", "ovs_interfaceid": "1c79fd0c-a2f5-4377-bc4d-7307a073163e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.412934] env[60764]: DEBUG oslo_concurrency.lockutils [req-4cc67c18-25bb-441a-9405-c4a3c67339e7 req-ce2ce3d2-f9c0-42bb-9e2b-30c6240cf67b service nova] Releasing lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 802.069589] env[60764]: DEBUG oslo_concurrency.lockutils [None req-34dc3448-b626-4cd5-a127-0c76cccc6a52 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] Acquiring lock "d7a99b7b-faa7-4904-9841-0ea582af96ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 802.069904] env[60764]: DEBUG oslo_concurrency.lockutils [None req-34dc3448-b626-4cd5-a127-0c76cccc6a52 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] Lock "d7a99b7b-faa7-4904-9841-0ea582af96ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 806.740481] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Acquiring lock "8cd907ae-697b-4fe3-86d4-e2e9f38ae424" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.740750] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 
tempest-MultipleCreateTestJSON-870240710-project-member] Lock "8cd907ae-697b-4fe3-86d4-e2e9f38ae424" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 806.767389] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Acquiring lock "67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 806.767526] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.236114] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0a8e08a0-9b08-4555-a386-495f6c8485d4 tempest-InstanceActionsTestJSON-1306287903 tempest-InstanceActionsTestJSON-1306287903-project-member] Acquiring lock "3b0f00e3-207a-416b-b971-c687df536a85" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.236114] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0a8e08a0-9b08-4555-a386-495f6c8485d4 tempest-InstanceActionsTestJSON-1306287903 tempest-InstanceActionsTestJSON-1306287903-project-member] Lock "3b0f00e3-207a-416b-b971-c687df536a85" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 810.883182] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1096b544-0c1f-421d-a938-00b228fb253e tempest-ServerMetadataNegativeTestJSON-533834650 tempest-ServerMetadataNegativeTestJSON-533834650-project-member] Acquiring lock "f926a47b-5252-41a9-9987-027858179887" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 810.883689] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1096b544-0c1f-421d-a938-00b228fb253e tempest-ServerMetadataNegativeTestJSON-533834650 tempest-ServerMetadataNegativeTestJSON-533834650-project-member] Lock "f926a47b-5252-41a9-9987-027858179887" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 835.629999] env[60764]: WARNING oslo_vmware.rw_handles [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 835.629999] env[60764]: ERROR 
oslo_vmware.rw_handles Traceback (most recent call last): [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 835.629999] env[60764]: ERROR oslo_vmware.rw_handles [ 835.630732] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 835.632135] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 835.632378] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Copying Virtual Disk [datastore2] vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/2c313b3e-bcba-4dd1-86b0-63a8392a3215/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 835.632660] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c1575526-aaa6-4286-9e59-a6533e5ed329 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 835.641043] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for the task: (returnval){ [ 835.641043] env[60764]: value = "task-2204902" [ 835.641043] env[60764]: _type = "Task" [ 835.641043] env[60764]: } to complete. 
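The rw_handles WARNING and traceback above record a benign failure mode: when oslo.vmware closes its datastore write handle, the ESX side may drop the connection without sending an HTTP status line, so the standard library raises http.client.RemoteDisconnected inside getresponse(). A stdlib-only sketch of that close path (the connection itself is assumed to have been opened by the caller):

import http.client

def close_upload(conn: http.client.HTTPSConnection) -> None:
    # Mirrors what rw_handles.close() does per the traceback: try to read
    # the server's response, but tolerate the peer hanging up first.
    try:
        conn.getresponse()
    except http.client.RemoteDisconnected:
        pass  # remote end closed the connection without a response
    finally:
        conn.close()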
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 835.649268] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Task: {'id': task-2204902, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 836.151370] env[60764]: DEBUG oslo_vmware.exceptions [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 836.151658] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 836.152238] env[60764]: ERROR nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 836.152238] env[60764]: Faults: ['InvalidArgument'] [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Traceback (most recent call last): [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] yield resources [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self.driver.spawn(context, instance, image_meta, [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self._vmops.spawn(context, instance, image_meta, injected_files, [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self._fetch_image_if_missing(context, vi) [ 836.152238] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] image_cache(vi, tmp_image_ds_loc) [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] vm_util.copy_virtual_disk( [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] session._wait_for_task(vmdk_copy_task) [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] return self.wait_for_task(task_ref) [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] return evt.wait() [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] result = hub.switch() [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 836.152611] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] return self.greenlet.switch() [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self.f(*self.args, **self.kw) [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] raise exceptions.translate_fault(task_info.error) [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Faults: ['InvalidArgument'] [ 836.152960] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] [ 836.152960] env[60764]: INFO nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Terminating instance [ 836.154110] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 836.154321] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 836.154558] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c568ec3b-3942-4578-87e9-c649d5b4a79f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.156832] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 836.156991] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquired lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 836.157169] env[60764]: DEBUG nova.network.neutron [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 836.164348] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 836.164523] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 836.165743] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-899fe4a6-a195-4baa-a1f0-b8f7bec8d946 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.173614] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 836.173614] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]523fee8a-0467-8755-7942-a33ad92daae6" [ 836.173614] env[60764]: _type = "Task" [ 836.173614] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 836.182930] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]523fee8a-0467-8755-7942-a33ad92daae6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 836.235595] env[60764]: DEBUG nova.network.neutron [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 836.347586] env[60764]: DEBUG nova.network.neutron [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 836.356728] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Releasing lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 836.357153] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 836.358917] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 836.358917] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-130eafc2-80b9-4f72-8c5a-73e491117455 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.366702] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 836.366935] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-76737099-11e3-4588-95ca-1fa8f8de5812 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.405620] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 836.405861] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 836.406066] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Deleting the datastore file [datastore2] 8d32e9d3-43ff-47dd-a52b-102edbe3af11 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 836.406333] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-84ac5645-3e1b-44e4-af00-3346f726c320 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.412556] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for the task: (returnval){ [ 836.412556] env[60764]: value = "task-2204904" [ 836.412556] env[60764]: _type = "Task" [ 836.412556] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 836.420439] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Task: {'id': task-2204904, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 836.683842] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 836.684180] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating directory with path [datastore2] vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 836.684351] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fbe44a20-2b63-4c14-af36-13ada9249d6f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.695998] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Created directory with path [datastore2] vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 836.696289] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Fetch image to [datastore2] vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 836.696403] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 836.697176] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1057288f-f502-4c7d-be94-ab2a7452e471 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.704983] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32a6341c-c4d1-4b28-896c-5b76950d23e1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.714589] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb78961-69fb-43de-b9d5-16a4a72a1b4f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.745112] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4534d162-e604-4683-9932-bac030069fe5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.750761] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e5dd9240-1906-4f85-940b-25c94cf19af1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 836.770748] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 836.820026] env[60764]: DEBUG oslo_vmware.rw_handles [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 836.879635] env[60764]: DEBUG oslo_vmware.rw_handles [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 836.879819] env[60764]: DEBUG oslo_vmware.rw_handles [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 836.922472] env[60764]: DEBUG oslo_vmware.api [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Task: {'id': task-2204904, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033248} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 836.922729] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 836.922908] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 836.923095] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 836.923271] env[60764]: INFO nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Took 0.57 seconds to destroy the instance on the hypervisor. [ 836.923512] env[60764]: DEBUG oslo.service.loopingcall [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 836.923759] env[60764]: DEBUG nova.compute.manager [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network deallocation for instance since networking was not requested. 
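The "Waiting for function ... to return" lines (loopingcall.py:435 above) are emitted from oslo.service's loopingcall module, which Nova uses to run a callable and block on its result. A generic sketch using that module's FixedIntervalLoopingCall, which may not be the exact construct used at this call site; the poll function and interval are placeholders:

from oslo_service import loopingcall

def poll_until_done(poll_fn, interval=2.0):
    # poll_fn() is called every `interval` seconds and should raise
    # loopingcall.LoopingCallDone(retvalue=...) once it has a result;
    # start() returns an event whose wait() yields that value.
    timer = loopingcall.FixedIntervalLoopingCall(poll_fn)
    return timer.start(interval=interval).wait()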
{{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 836.925901] env[60764]: DEBUG nova.compute.claims [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 836.926091] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 836.926314] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 837.292371] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0f0ef5d-0942-46ab-b883-c1c5b7a1e6bd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.300196] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34503fba-2264-4398-872a-41ac854469ff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.329594] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1653a2c-8735-4990-97c7-19b8cc9b37db {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.336975] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43998686-a6f7-40c9-8e69-2d4ecf8e56b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.350029] env[60764]: DEBUG nova.compute.provider_tree [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 837.359185] env[60764]: DEBUG nova.scheduler.client.report [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 837.375829] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.449s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.377075] env[60764]: ERROR nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 837.377075] env[60764]: Faults: ['InvalidArgument'] [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Traceback (most recent call last): [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self.driver.spawn(context, instance, image_meta, [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self._vmops.spawn(context, instance, image_meta, injected_files, [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self._fetch_image_if_missing(context, vi) [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] image_cache(vi, tmp_image_ds_loc) [ 837.377075] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] vm_util.copy_virtual_disk( [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] session._wait_for_task(vmdk_copy_task) [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] return self.wait_for_task(task_ref) [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] return evt.wait() [ 
837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] result = hub.switch() [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] return self.greenlet.switch() [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 837.377480] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] self.f(*self.args, **self.kw) [ 837.377968] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 837.377968] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] raise exceptions.translate_fault(task_info.error) [ 837.377968] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 837.377968] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Faults: ['InvalidArgument'] [ 837.377968] env[60764]: ERROR nova.compute.manager [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] [ 837.377968] env[60764]: DEBUG nova.compute.utils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 837.378847] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Build of instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 was re-scheduled: A specified parameter was not correct: fileType [ 837.378847] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 837.379250] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 837.379476] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 837.379621] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 
tempest-ServersAdmin275Test-1188688059-project-member] Acquired lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 837.379768] env[60764]: DEBUG nova.network.neutron [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 837.436976] env[60764]: DEBUG nova.network.neutron [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 837.515100] env[60764]: DEBUG nova.network.neutron [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 837.524546] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Releasing lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 837.524762] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 837.524938] env[60764]: DEBUG nova.compute.manager [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Skipping network deallocation for instance since networking was not requested. 
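The Acquiring / "acquired ... waited" / "released ... held" lock lines that recur throughout this trace (the "compute_resources" and "refresh_cache-<uuid>" locks above, among others) come from oslo.concurrency's lock helpers. A minimal sketch of the two usual forms, illustrative rather than Nova's actual wrappers; the callables passed in are placeholders:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def abort_claim(abort_fn):
    # Runs with the named in-process lock held; lockutils logs the
    # waited/held timings seen in this trace.
    abort_fn()

def refresh_network_cache(instance_uuid, refresh_fn):
    # Context-manager form, as used around the refresh_cache-<uuid> locks.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        refresh_fn(instance_uuid)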
{{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 837.613561] env[60764]: INFO nova.scheduler.client.report [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Deleted allocations for instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 [ 837.633458] env[60764]: DEBUG oslo_concurrency.lockutils [None req-5746371a-3ffa-43db-83f2-febc133e912a tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 288.769s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.634625] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 87.070s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 837.635479] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 837.635479] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 837.635479] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.637341] env[60764]: INFO nova.compute.manager [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Terminating instance [ 837.638973] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquiring lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 837.639142] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Acquired lock 
"refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 837.639308] env[60764]: DEBUG nova.network.neutron [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 837.648946] env[60764]: DEBUG nova.compute.manager [None req-606aca73-fcb7-4124-b419-7f26859b5b5d tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: a48f97b9-8720-4ad2-82a3-bae679b6b2ef] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 837.663359] env[60764]: DEBUG nova.network.neutron [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 837.675846] env[60764]: DEBUG nova.compute.manager [None req-606aca73-fcb7-4124-b419-7f26859b5b5d tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: a48f97b9-8720-4ad2-82a3-bae679b6b2ef] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 837.697301] env[60764]: DEBUG oslo_concurrency.lockutils [None req-606aca73-fcb7-4124-b419-7f26859b5b5d tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "a48f97b9-8720-4ad2-82a3-bae679b6b2ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.837s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.710853] env[60764]: DEBUG nova.compute.manager [None req-90b2c84b-051b-47c5-b3b5-b2639dfcb03a tempest-ServerShowV254Test-1358377146 tempest-ServerShowV254Test-1358377146-project-member] [instance: 8eb675b6-f73d-47a0-af75-63b5a8800e69] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 837.737294] env[60764]: DEBUG nova.compute.manager [None req-90b2c84b-051b-47c5-b3b5-b2639dfcb03a tempest-ServerShowV254Test-1358377146 tempest-ServerShowV254Test-1358377146-project-member] [instance: 8eb675b6-f73d-47a0-af75-63b5a8800e69] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 837.739622] env[60764]: DEBUG nova.network.neutron [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 837.748281] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Releasing lock "refresh_cache-8d32e9d3-43ff-47dd-a52b-102edbe3af11" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 837.748752] env[60764]: DEBUG nova.compute.manager [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 837.749134] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 837.751704] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-05f3ab75-bc1b-4da7-89c5-71e41b9203bf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.760631] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2937e7ba-febc-4d65-aff9-d295f711bf2a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 837.771895] env[60764]: DEBUG oslo_concurrency.lockutils [None req-90b2c84b-051b-47c5-b3b5-b2639dfcb03a tempest-ServerShowV254Test-1358377146 tempest-ServerShowV254Test-1358377146-project-member] Lock "8eb675b6-f73d-47a0-af75-63b5a8800e69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.867s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.790377] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8d32e9d3-43ff-47dd-a52b-102edbe3af11 could not be found. 
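The repeated "Acquiring lock ... / acquired ... waited N.NNNs / ... released ... held N.NNNs" triplets above come from oslo.concurrency's lockutils wrappers around Nova's per-instance locks such as "refresh_cache-<uuid>" and the bare instance UUID. A minimal sketch of that locking pattern follows; the lock names, the _refresh_nw_cache() helper and its body are illustrative placeholders, not code taken from Nova itself.

from oslo_concurrency import lockutils


def _refresh_nw_cache(instance_uuid):
    # Placeholder for the real cache refresh performed by the compute manager.
    print("refreshing network info cache for %s" % instance_uuid)


def refresh_instance_cache(instance_uuid):
    # Serializes all cache refreshes for one instance inside this process,
    # producing the same Acquiring/Acquired/Releasing log lines seen above.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        _refresh_nw_cache(instance_uuid)


@lockutils.synchronized('compute_resources')
def claim_resources():
    # Decorator form of the same idea: the whole function body runs under
    # the named lock. By default this is an in-process semaphore; passing
    # external=True (or using lockutils.lock(..., external=True)) switches
    # to a file-based lock shared between worker processes.
    pass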
[ 837.790585] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 837.790765] env[60764]: INFO nova.compute.manager [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Took 0.04 seconds to destroy the instance on the hypervisor. [ 837.791054] env[60764]: DEBUG oslo.service.loopingcall [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 837.791784] env[60764]: DEBUG nova.compute.manager [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 837.791911] env[60764]: DEBUG nova.network.neutron [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 837.793651] env[60764]: DEBUG nova.compute.manager [None req-b336dd07-f201-4870-bd01-49938624ba7b tempest-ServerActionsV293TestJSON-292591111 tempest-ServerActionsV293TestJSON-292591111-project-member] [instance: fdd51c42-e0c1-4ad1-aee9-f091cfec0030] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 837.810723] env[60764]: DEBUG nova.network.neutron [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 837.821648] env[60764]: DEBUG nova.compute.manager [None req-b336dd07-f201-4870-bd01-49938624ba7b tempest-ServerActionsV293TestJSON-292591111 tempest-ServerActionsV293TestJSON-292591111-project-member] [instance: fdd51c42-e0c1-4ad1-aee9-f091cfec0030] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 837.822953] env[60764]: DEBUG nova.network.neutron [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 837.830770] env[60764]: INFO nova.compute.manager [-] [instance: 8d32e9d3-43ff-47dd-a52b-102edbe3af11] Took 0.04 seconds to deallocate network for instance. 
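The oslo.service.loopingcall entry above ("Waiting for function ... _deallocate_network_with_retries to return") is the retry helper that re-runs network deallocation when it fails transiently. A sketch of that retry-decorator pattern, with the exception type, retry counts and function body assumed purely for illustration rather than copied from Nova:

from oslo_service import loopingcall


class TransientNeutronError(Exception):
    """Stand-in for whatever transient error the real code retries on."""


@loopingcall.RetryDecorator(max_retry_count=3,
                            inc_sleep_time=1,
                            max_sleep_time=10,
                            exceptions=(TransientNeutronError,))
def _deallocate_network_with_retries():
    # Re-invoked with increasing sleeps while one of the listed exceptions
    # is raised; any other exception propagates immediately, and the result
    # of the final successful call is returned to the caller.
    print("deallocating networks for the instance")


_deallocate_network_with_retries()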
[ 837.843266] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b336dd07-f201-4870-bd01-49938624ba7b tempest-ServerActionsV293TestJSON-292591111 tempest-ServerActionsV293TestJSON-292591111-project-member] Lock "fdd51c42-e0c1-4ad1-aee9-f091cfec0030" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.149s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.852185] env[60764]: DEBUG nova.compute.manager [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 207825f2-e4aa-4747-bd79-384773b3d516] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 837.884899] env[60764]: DEBUG nova.compute.manager [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 207825f2-e4aa-4747-bd79-384773b3d516] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 837.909460] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "207825f2-e4aa-4747-bd79-384773b3d516" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.803s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.922898] env[60764]: DEBUG nova.compute.manager [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 57919d5b-769c-4752-873c-d78bb61f5800] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 837.933273] env[60764]: DEBUG oslo_concurrency.lockutils [None req-00639efa-494b-4b13-b839-a53e7c12488d tempest-ServersAdmin275Test-1188688059 tempest-ServersAdmin275Test-1188688059-project-member] Lock "8d32e9d3-43ff-47dd-a52b-102edbe3af11" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.298s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.953055] env[60764]: DEBUG nova.compute.manager [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 57919d5b-769c-4752-873c-d78bb61f5800] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 837.976790] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7c8d3f17-ec44-4196-8a63-79896cae974d tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "57919d5b-769c-4752-873c-d78bb61f5800" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.836s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 837.992535] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 838.058870] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 838.059287] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 838.061898] env[60764]: INFO nova.compute.claims [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 838.556117] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f6da5cd-9e8d-437e-87cd-56f223181c93 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.564307] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3edb2f8d-61ad-42a0-b106-905f893b7d36 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.595064] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c56cecf-20bc-46f1-98ce-d46e57491999 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.602015] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6546861c-cdff-4e64-8b37-907d45b6a94c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.615614] env[60764]: DEBUG nova.compute.provider_tree [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 838.627993] env[60764]: DEBUG nova.scheduler.client.report [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 838.642664] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.583s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 838.643057] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 838.686034] env[60764]: DEBUG nova.compute.utils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 838.686471] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 838.686772] env[60764]: DEBUG nova.network.neutron [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 838.700018] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 838.743394] env[60764]: INFO nova.virt.block_device [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Booting with volume 387072a8-7c57-43cc-8a5f-d4eb1eff6cd9 at /dev/sda [ 838.788417] env[60764]: DEBUG nova.policy [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10c741e621b242ab8f79efc5e9ea76dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cf1b25f7bbec4186bff6532c2abfab84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 838.799830] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-84bd702e-a1f6-44ad-8fc8-94ce725540ef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.812329] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6df88254-4518-4455-a135-61ec33500342 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.858386] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a2f9209c-1f92-4177-b5e7-c11f031d4c3b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.870769] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-671140d2-c955-42f4-af28-b7ce8566eb9a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.900831] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76c348e8-afc6-4687-8026-c5ef0e0f1dcd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.907526] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4ed5aef-449a-428d-a506-76c1158e0948 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 838.922601] env[60764]: DEBUG nova.virt.block_device [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updating existing volume attachment record: 448163e6-42d0-4a73-a7e9-767385790651 {{(pid=60764) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 839.207686] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 839.208305] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 839.208528] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 839.208680] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 839.208860] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 839.209211] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 839.209279] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 839.209578] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 839.209681] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 839.210014] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Got 1 possible 
topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 839.210176] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 839.210352] env[60764]: DEBUG nova.virt.hardware [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 839.211627] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f9b0be9-ca38-4e41-99bf-c11b320572b1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 839.220996] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8723867-0b57-4ab3-a51e-38268e49589e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 839.416141] env[60764]: DEBUG nova.network.neutron [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Successfully created port: a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 840.414404] env[60764]: DEBUG nova.network.neutron [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Successfully updated port: a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 840.416879] env[60764]: DEBUG nova.compute.manager [req-4d2bb155-5e8c-4963-b22e-b562f4c3b05d req-aaef06ab-45c3-41ee-b020-5bb21431173f service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Received event network-vif-plugged-a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 840.417037] env[60764]: DEBUG oslo_concurrency.lockutils [req-4d2bb155-5e8c-4963-b22e-b562f4c3b05d req-aaef06ab-45c3-41ee-b020-5bb21431173f service nova] Acquiring lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 840.417241] env[60764]: DEBUG oslo_concurrency.lockutils [req-4d2bb155-5e8c-4963-b22e-b562f4c3b05d req-aaef06ab-45c3-41ee-b020-5bb21431173f service nova] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 840.417322] env[60764]: DEBUG oslo_concurrency.lockutils [req-4d2bb155-5e8c-4963-b22e-b562f4c3b05d req-aaef06ab-45c3-41ee-b020-5bb21431173f service nova] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 840.417504] env[60764]: DEBUG nova.compute.manager [req-4d2bb155-5e8c-4963-b22e-b562f4c3b05d req-aaef06ab-45c3-41ee-b020-5bb21431173f service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] No waiting events found dispatching network-vif-plugged-a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 840.417663] env[60764]: WARNING nova.compute.manager [req-4d2bb155-5e8c-4963-b22e-b562f4c3b05d req-aaef06ab-45c3-41ee-b020-5bb21431173f service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Received unexpected event network-vif-plugged-a1f88f2c-4533-4c02-ba84-252ae2327711 for instance with vm_state building and task_state spawning. [ 840.433230] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquiring lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 840.433230] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquired lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 840.433337] env[60764]: DEBUG nova.network.neutron [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 840.500695] env[60764]: DEBUG nova.network.neutron [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 840.769116] env[60764]: DEBUG nova.network.neutron [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updating instance_info_cache with network_info: [{"id": "a1f88f2c-4533-4c02-ba84-252ae2327711", "address": "fa:16:3e:34:f9:95", "network": {"id": "eb4fb2ce-ddc7-4a2d-9c3c-4d1273be8c3f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-947629294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cf1b25f7bbec4186bff6532c2abfab84", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd7d0d95-6848-4e69-ac21-75f8db82a3b5", "external-id": "nsx-vlan-transportzone-272", "segmentation_id": 272, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1f88f2c-45", "ovs_interfaceid": "a1f88f2c-4533-4c02-ba84-252ae2327711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 840.782723] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Releasing lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 840.782723] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Instance network_info: |[{"id": "a1f88f2c-4533-4c02-ba84-252ae2327711", "address": "fa:16:3e:34:f9:95", "network": {"id": "eb4fb2ce-ddc7-4a2d-9c3c-4d1273be8c3f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-947629294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cf1b25f7bbec4186bff6532c2abfab84", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd7d0d95-6848-4e69-ac21-75f8db82a3b5", "external-id": "nsx-vlan-transportzone-272", "segmentation_id": 272, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1f88f2c-45", "ovs_interfaceid": "a1f88f2c-4533-4c02-ba84-252ae2327711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 840.782841] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:f9:95', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dd7d0d95-6848-4e69-ac21-75f8db82a3b5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a1f88f2c-4533-4c02-ba84-252ae2327711', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 840.790143] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Creating folder: Project (cf1b25f7bbec4186bff6532c2abfab84). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 840.790675] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d5ee4a8-540d-48e3-a7ee-3cf3355d1596 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.804337] env[60764]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 840.804526] env[60764]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=60764) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 840.804889] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Folder already exists: Project (cf1b25f7bbec4186bff6532c2abfab84). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 840.805135] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Creating folder: Instances. Parent ref: group-v449670. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 840.805397] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-56d762d6-ac15-42b7-b047-ceb5faa0916d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.814364] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Created folder: Instances in parent group-v449670. [ 840.814604] env[60764]: DEBUG oslo.service.loopingcall [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 840.814803] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 840.814998] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-14cd8179-5d01-48be-a77e-290cbca6449d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.835011] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 840.835011] env[60764]: value = "task-2204907" [ 840.835011] env[60764]: _type = "Task" [ 840.835011] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 840.843019] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204907, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 841.346027] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204907, 'name': CreateVM_Task, 'duration_secs': 0.280517} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 841.346214] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 841.346846] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'disk_bus': None, 'mount_device': '/dev/sda', 'delete_on_termination': True, 'device_type': None, 'attachment_id': '448163e6-42d0-4a73-a7e9-767385790651', 'boot_index': 0, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-449673', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'name': 'volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e', 'attached_at': '', 'detached_at': '', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'serial': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9'}, 'guest_format': None, 'volume_type': None}], 'swap': None} {{(pid=60764) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 841.347084] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Root volume attach. 
Driver type: vmdk {{(pid=60764) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 841.347837] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d43bb983-6b28-48f3-8ebc-87205089e4fc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.357099] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d66f1527-4ba5-4b43-a890-5a9f260637c2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.363138] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-027ed515-d9a5-4b51-af53-82d02232390d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.367900] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-4fecd0e3-687d-4a70-bf60-2eb9cc48b5e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.381604] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 841.381604] env[60764]: value = "task-2204908" [ 841.381604] env[60764]: _type = "Task" [ 841.381604] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 841.387845] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204908, 'name': RelocateVM_Task} progress is 5%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 841.891375] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204908, 'name': RelocateVM_Task, 'duration_secs': 0.350979} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 841.891375] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Volume attach. 
Driver type: vmdk {{(pid=60764) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 841.892554] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-449673', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'name': 'volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e', 'attached_at': '', 'detached_at': '', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'serial': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9'} {{(pid=60764) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 841.893136] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebd14972-f91b-4ebf-afa5-e315d5eafe10 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.913198] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7de7a6fa-87ad-41a8-b7c3-c271cbe35664 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.939210] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Reconfiguring VM instance instance-00000021 to attach disk [datastore2] volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9/volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9.vmdk or device None with type thin {{(pid=60764) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 841.939715] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-373c1c49-b44d-4bfb-bec8-1034e242eecf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.961748] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 841.961748] env[60764]: value = "task-2204909" [ 841.961748] env[60764]: _type = "Task" [ 841.961748] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 841.971448] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204909, 'name': ReconfigVM_Task} progress is 6%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 842.472012] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204909, 'name': ReconfigVM_Task, 'duration_secs': 0.255589} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 842.472397] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Reconfigured VM instance instance-00000021 to attach disk [datastore2] volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9/volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9.vmdk or device None with type thin {{(pid=60764) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 842.478725] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-ef4a1c0e-b394-4fde-bc72-af755f852eff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 842.500892] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 842.500892] env[60764]: value = "task-2204910" [ 842.500892] env[60764]: _type = "Task" [ 842.500892] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 842.509126] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204910, 'name': ReconfigVM_Task} progress is 6%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 842.703756] env[60764]: DEBUG nova.compute.manager [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Received event network-changed-a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 842.703956] env[60764]: DEBUG nova.compute.manager [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Refreshing instance network info cache due to event network-changed-a1f88f2c-4533-4c02-ba84-252ae2327711. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 842.704188] env[60764]: DEBUG oslo_concurrency.lockutils [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] Acquiring lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 842.704339] env[60764]: DEBUG oslo_concurrency.lockutils [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] Acquired lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 842.704507] env[60764]: DEBUG nova.network.neutron [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Refreshing network info cache for port a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 843.011091] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204910, 'name': ReconfigVM_Task, 'duration_secs': 0.117326} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 843.011594] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-449673', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'name': 'volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e', 'attached_at': '', 'detached_at': '', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'serial': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9'} {{(pid=60764) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 843.012096] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-c04c0345-8dd3-45ba-b436-7173f9d26c92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.018684] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 843.018684] env[60764]: value = "task-2204911" [ 843.018684] env[60764]: _type = "Task" [ 843.018684] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 843.028027] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204911, 'name': Rename_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 843.302137] env[60764]: DEBUG nova.network.neutron [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updated VIF entry in instance network info cache for port a1f88f2c-4533-4c02-ba84-252ae2327711. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 843.302504] env[60764]: DEBUG nova.network.neutron [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updating instance_info_cache with network_info: [{"id": "a1f88f2c-4533-4c02-ba84-252ae2327711", "address": "fa:16:3e:34:f9:95", "network": {"id": "eb4fb2ce-ddc7-4a2d-9c3c-4d1273be8c3f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-947629294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cf1b25f7bbec4186bff6532c2abfab84", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd7d0d95-6848-4e69-ac21-75f8db82a3b5", "external-id": "nsx-vlan-transportzone-272", "segmentation_id": 272, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1f88f2c-45", "ovs_interfaceid": "a1f88f2c-4533-4c02-ba84-252ae2327711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 843.313148] env[60764]: DEBUG oslo_concurrency.lockutils [req-f19007d1-86f9-4bbe-b5b7-94aa055a3dbd req-fde8112a-9989-4791-b5e7-7dc0c8ad0539 service nova] Releasing lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 843.531157] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204911, 'name': Rename_Task, 'duration_secs': 0.13287} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 843.531821] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Powering on the VM {{(pid=60764) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 843.532238] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-6de6dc2e-5ea3-4f45-b52b-a3e40adc3338 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 843.543017] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 843.543017] env[60764]: value = "task-2204912" [ 843.543017] env[60764]: _type = "Task" [ 843.543017] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 843.551095] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204912, 'name': PowerOnVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 844.057070] env[60764]: DEBUG oslo_vmware.api [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204912, 'name': PowerOnVM_Task, 'duration_secs': 0.476676} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 844.057615] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Powered on the VM {{(pid=60764) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 844.057615] env[60764]: INFO nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Took 4.85 seconds to spawn the instance on the hypervisor. [ 844.057886] env[60764]: DEBUG nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Checking state {{(pid=60764) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 844.058668] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dea764f-ee30-43d3-aa72-1000d5d4838d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.124838] env[60764]: INFO nova.compute.manager [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Took 6.08 seconds to build instance. 
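The CreateVM_Task / RelocateVM_Task / ReconfigVM_Task / Rename_Task / PowerOnVM_Task sequence above is driven through an oslo.vmware session whose wait_for_task() produces the "Waiting for the task ... progress is N% ... completed successfully" polling lines. A rough sketch of that call-and-poll pattern, with placeholder vCenter endpoint and credentials and a naive VM lookup standing in for the managed-object references the driver already holds:

from oslo_vmware import api, vim_util


# Placeholder endpoint and credentials; in Nova these come from nova.conf.
session = api.VMwareAPISession(
    'vcenter.example.org', 'administrator@vsphere.local', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# Naive lookup of some existing VM's managed object reference, only so the
# sketch has something to act on.
result = session.invoke_api(vim_util, 'get_objects', session.vim,
                            'VirtualMachine', 100)
vm_ref = result.objects[0].obj

# Every vSphere *_Task method returns a task moref; wait_for_task() polls it
# until it succeeds or raises on failure, logging progress at the
# task_poll_interval passed to the session.
task = session.invoke_api(session.vim, 'PowerOnVM_Task', vm_ref)
session.wait_for_task(task)

session.logout()

The repeated "progress is N%" entries in the log reflect exactly this polling loop: each task (CreateVM_Task, ReconfigVM_Task, PowerOnVM_Task, ...) is submitted, then waited on until vCenter reports completion.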
[ 844.142622] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e58d47e6-a3ec-42b7-82be-4b0d87003295 tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 174.783s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 844.159374] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 844.222146] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 844.222399] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 844.224011] env[60764]: INFO nova.compute.claims [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 844.742140] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1ac5e1b-b25d-4ea3-848d-22d89f084a9b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.750874] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a43bdc98-3659-4e27-99de-6fda6585e606 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.792432] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7658128-d253-4f54-95af-0c53bea00f84 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.801241] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-715946de-6e51-4a64-8c46-6eba84c7afff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 844.816637] env[60764]: DEBUG nova.compute.provider_tree [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 
844.826140] env[60764]: DEBUG nova.scheduler.client.report [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 844.852200] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.630s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 844.852828] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 844.892654] env[60764]: DEBUG nova.compute.utils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 844.894108] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 844.894108] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 844.912375] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 844.984873] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 844.996298] env[60764]: DEBUG nova.policy [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e6b8bd92bc1427a80dd64e2b69bd4f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef985aa716f94330ae11535fffddf02a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 845.024800] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 845.025085] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 845.025310] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 845.025473] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 845.025638] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 845.025978] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 845.026166] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 845.026407] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 845.026850] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 845.026850] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 845.027152] env[60764]: DEBUG nova.virt.hardware [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 845.028100] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07afa5cd-0b1d-43db-85fa-0ea968c32378 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.038285] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35276604-cd28-4d2d-b17a-6c45ddbeb299 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 845.561020] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Successfully created port: 5e2391b0-93c6-4bf2-9f70-4208b14dc39d {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 845.931940] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 845.932229] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 
tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 846.580745] env[60764]: DEBUG nova.compute.manager [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Received event network-changed-a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 846.581023] env[60764]: DEBUG nova.compute.manager [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Refreshing instance network info cache due to event network-changed-a1f88f2c-4533-4c02-ba84-252ae2327711. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 846.581178] env[60764]: DEBUG oslo_concurrency.lockutils [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] Acquiring lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 846.581414] env[60764]: DEBUG oslo_concurrency.lockutils [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] Acquired lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 846.581606] env[60764]: DEBUG nova.network.neutron [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Refreshing network info cache for port a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 846.656638] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Successfully updated port: 5e2391b0-93c6-4bf2-9f70-4208b14dc39d {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 846.668223] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "refresh_cache-7a843233-c56c-4d87-aeb0-2ffaa441b021" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 846.668380] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquired lock "refresh_cache-7a843233-c56c-4d87-aeb0-2ffaa441b021" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 846.668532] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] 
Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 846.745045] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 846.918426] env[60764]: DEBUG nova.compute.manager [req-86f51c5d-ea7e-4a6d-9638-ee43b84d83e0 req-4c51635b-52ff-4bb3-a5b6-ff0366b21d02 service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Received event network-vif-plugged-5e2391b0-93c6-4bf2-9f70-4208b14dc39d {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 846.918426] env[60764]: DEBUG oslo_concurrency.lockutils [req-86f51c5d-ea7e-4a6d-9638-ee43b84d83e0 req-4c51635b-52ff-4bb3-a5b6-ff0366b21d02 service nova] Acquiring lock "7a843233-c56c-4d87-aeb0-2ffaa441b021-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 846.918572] env[60764]: DEBUG oslo_concurrency.lockutils [req-86f51c5d-ea7e-4a6d-9638-ee43b84d83e0 req-4c51635b-52ff-4bb3-a5b6-ff0366b21d02 service nova] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 846.918773] env[60764]: DEBUG oslo_concurrency.lockutils [req-86f51c5d-ea7e-4a6d-9638-ee43b84d83e0 req-4c51635b-52ff-4bb3-a5b6-ff0366b21d02 service nova] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 846.919259] env[60764]: DEBUG nova.compute.manager [req-86f51c5d-ea7e-4a6d-9638-ee43b84d83e0 req-4c51635b-52ff-4bb3-a5b6-ff0366b21d02 service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] No waiting events found dispatching network-vif-plugged-5e2391b0-93c6-4bf2-9f70-4208b14dc39d {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 846.919520] env[60764]: WARNING nova.compute.manager [req-86f51c5d-ea7e-4a6d-9638-ee43b84d83e0 req-4c51635b-52ff-4bb3-a5b6-ff0366b21d02 service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Received unexpected event network-vif-plugged-5e2391b0-93c6-4bf2-9f70-4208b14dc39d for instance with vm_state building and task_state spawning. 
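The `instance_info_cache` updates surrounding this point all carry the same `network_info` structure: a list of VIF entries with a port id, MAC address, devname, OVS/NSX binding details, and per-subnet fixed and floating IPs. A hedged, self-contained sketch of pulling the commonly grepped fields out of one such entry follows; the sample dict is a trimmed copy of the cache entry for port 5e2391b0-93c6-4bf2-9f70-4208b14dc39d, and `summarize_vif` is a hypothetical helper, not Nova code.

```python
# Hedged sketch: extract MAC, fixed/floating IPs, devname and segmentation id
# from one network_info entry as it appears in the instance_info_cache updates
# in this log. The sample is trimmed from the real log entry; summarize_vif is
# illustrative only.
from typing import Dict, List

sample_vif = {
    "id": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d",
    "address": "fa:16:3e:9b:b5:46",
    "devname": "tap5e2391b0-93",
    "details": {"segmentation_id": 889,
                "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b"},
    "network": {
        "label": "shared",
        "subnets": [{
            "cidr": "192.168.233.0/24",
            "ips": [{"address": "192.168.233.176", "type": "fixed",
                     "floating_ips": []}],
        }],
    },
}


def summarize_vif(vif: Dict) -> Dict[str, object]:
    fixed: List[str] = []
    floating: List[str] = []
    for subnet in vif["network"].get("subnets", []):
        for ip in subnet.get("ips", []):
            fixed.append(ip["address"])
            floating.extend(f["address"] for f in ip.get("floating_ips", []))
    return {
        "port_id": vif["id"],
        "mac": vif["address"],
        "devname": vif.get("devname"),
        "network": vif["network"].get("label"),
        "fixed_ips": fixed,
        "floating_ips": floating,
        "segmentation_id": vif.get("details", {}).get("segmentation_id"),
    }


print(summarize_vif(sample_vif))
# -> fixed_ips ['192.168.233.176'], no floating IPs yet; compare the later
#    cache refreshes for port a1f88f2c-..., where a floating IP 10.180.180.231
#    appears under the same "floating_ips" key.
```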
[ 846.988077] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Updating instance_info_cache with network_info: [{"id": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d", "address": "fa:16:3e:9b:b5:46", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e2391b0-93", "ovs_interfaceid": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.005218] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Releasing lock "refresh_cache-7a843233-c56c-4d87-aeb0-2ffaa441b021" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.005646] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance network_info: |[{"id": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d", "address": "fa:16:3e:9b:b5:46", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e2391b0-93", "ovs_interfaceid": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 847.006528] env[60764]: DEBUG nova.network.neutron [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 
req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updated VIF entry in instance network info cache for port a1f88f2c-4533-4c02-ba84-252ae2327711. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 847.006858] env[60764]: DEBUG nova.network.neutron [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updating instance_info_cache with network_info: [{"id": "a1f88f2c-4533-4c02-ba84-252ae2327711", "address": "fa:16:3e:34:f9:95", "network": {"id": "eb4fb2ce-ddc7-4a2d-9c3c-4d1273be8c3f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-947629294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cf1b25f7bbec4186bff6532c2abfab84", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd7d0d95-6848-4e69-ac21-75f8db82a3b5", "external-id": "nsx-vlan-transportzone-272", "segmentation_id": 272, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1f88f2c-45", "ovs_interfaceid": "a1f88f2c-4533-4c02-ba84-252ae2327711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.011704] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:b5:46', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5e2391b0-93c6-4bf2-9f70-4208b14dc39d', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 847.017521] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Creating folder: Project (ef985aa716f94330ae11535fffddf02a). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 847.018979] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4e81e46f-28a2-4dd5-b6ef-adf80cd74966 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.021178] env[60764]: DEBUG oslo_concurrency.lockutils [req-86dab0af-64cf-4cec-9012-e77e5f6c5e36 req-1845851c-54ce-4f4e-95e5-0185a8c81b0a service nova] Releasing lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.031067] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Created folder: Project (ef985aa716f94330ae11535fffddf02a) in parent group-v449629. [ 847.031562] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Creating folder: Instances. Parent ref: group-v449685. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 847.031840] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c2260389-8890-477b-b051-f74d7b6332fb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.040735] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Created folder: Instances in parent group-v449685. [ 847.040970] env[60764]: DEBUG oslo.service.loopingcall [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 847.041178] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 847.041550] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1193f2a4-2de1-415e-ae60-712c2e8427c3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.061089] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 847.061089] env[60764]: value = "task-2204915" [ 847.061089] env[60764]: _type = "Task" [ 847.061089] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.068824] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204915, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 847.571485] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204915, 'name': CreateVM_Task, 'duration_secs': 0.371921} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 847.571688] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 847.572384] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 847.572567] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 847.572897] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 847.573172] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-36ed19ce-f09a-4bb3-b9d7-18fc2c91ea9a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.578048] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Waiting for the task: (returnval){ [ 847.578048] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5260d163-f6f5-7bb7-f475-7a50f9273ac3" [ 847.578048] env[60764]: _type = "Task" [ 847.578048] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.585710] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5260d163-f6f5-7bb7-f475-7a50f9273ac3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 848.090860] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 848.091129] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 848.091392] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 848.941760] env[60764]: DEBUG nova.compute.manager [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Received event network-changed-5e2391b0-93c6-4bf2-9f70-4208b14dc39d {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 848.942071] env[60764]: DEBUG nova.compute.manager [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Refreshing instance network info cache due to event network-changed-5e2391b0-93c6-4bf2-9f70-4208b14dc39d. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 848.942293] env[60764]: DEBUG oslo_concurrency.lockutils [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] Acquiring lock "refresh_cache-7a843233-c56c-4d87-aeb0-2ffaa441b021" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 848.942421] env[60764]: DEBUG oslo_concurrency.lockutils [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] Acquired lock "refresh_cache-7a843233-c56c-4d87-aeb0-2ffaa441b021" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 848.942471] env[60764]: DEBUG nova.network.neutron [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Refreshing network info cache for port 5e2391b0-93c6-4bf2-9f70-4208b14dc39d {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 849.262429] env[60764]: DEBUG nova.network.neutron [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Updated VIF entry in instance network info cache for port 5e2391b0-93c6-4bf2-9f70-4208b14dc39d. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 849.262738] env[60764]: DEBUG nova.network.neutron [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Updating instance_info_cache with network_info: [{"id": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d", "address": "fa:16:3e:9b:b5:46", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e2391b0-93", "ovs_interfaceid": "5e2391b0-93c6-4bf2-9f70-4208b14dc39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.273313] env[60764]: DEBUG oslo_concurrency.lockutils [req-84f035d7-decc-4145-a1af-a49027bcbf5a req-de6a5ac5-2cc2-4342-ab38-129e5ac65a9e service nova] Releasing lock "refresh_cache-7a843233-c56c-4d87-aeb0-2ffaa441b021" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 851.329844] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 851.330241] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 851.330241] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 851.352318] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.352515] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.352633] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.352759] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.352881] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.353012] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.353136] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.353254] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.353370] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.353491] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 851.595864] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 851.596015] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquired lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 851.596171] env[60764]: DEBUG nova.network.neutron [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Forcefully refreshing network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2004}} [ 851.596365] env[60764]: DEBUG nova.objects.instance [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lazy-loading 'info_cache' on Instance uuid c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e {{(pid=60764) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 851.994276] env[60764]: DEBUG nova.network.neutron [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updating instance_info_cache with network_info: [{"id": "a1f88f2c-4533-4c02-ba84-252ae2327711", "address": "fa:16:3e:34:f9:95", "network": {"id": "eb4fb2ce-ddc7-4a2d-9c3c-4d1273be8c3f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-947629294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cf1b25f7bbec4186bff6532c2abfab84", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd7d0d95-6848-4e69-ac21-75f8db82a3b5", "external-id": "nsx-vlan-transportzone-272", "segmentation_id": 272, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1f88f2c-45", "ovs_interfaceid": "a1f88f2c-4533-4c02-ba84-252ae2327711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 852.005264] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Releasing lock "refresh_cache-c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 852.009183] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updated the network info_cache for instance {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9982}} [ 852.009183] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances 
{{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 852.009183] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 852.009183] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 852.019801] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 852.020084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 852.020303] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 852.020468] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 852.021980] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ba05019-68ff-468a-8ee7-0344ddbaaff8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.031147] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f90db6e8-93a4-4bee-a8b4-edf65b97fd45 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.045953] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cd70b69-4259-464b-bbd2-177293a8f414 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.052331] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb69c1d0-912f-4736-9cfd-e418866b3765 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.080998] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181244MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 852.081171] 
env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 852.081364] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 852.157075] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bea83327-9479-46b2-bd78-c81d72359e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157075] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157075] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157075] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ff2ef5e9-f543-4592-9896-e2c75369a971 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157285] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157285] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157285] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157285] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157408] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157408] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.157408] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 852.170446] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.181620] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.192123] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 79528e3a-72e2-4d7e-913d-bb42d757fe64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.202250] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 15d15d22-4ffa-43a1-ab5a-506637d1a3cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.212212] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d2496c8d-c17c-4178-a2a9-85390aa0bb21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.221364] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7cef2173-9a2d-4428-81d2-f13b807967c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.230612] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.240410] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b2e4096c-0edc-44ac-a4b6-3a32e0466c54 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.250381] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 30aca955-c304-43e8-8da1-91cd7f4e1b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.260766] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c5627bb3-36c8-415f-bf4e-449adedd5ba6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.269902] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 5bd8caaf-f61b-4df2-abd0-3da5259ae829 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.280676] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9d4328ec-dab2-41c8-88da-82df6f2ae17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.291573] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 12c9b68c-740c-4555-915d-b23c8b5f0473 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.301472] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d7a99b7b-faa7-4904-9841-0ea582af96ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.311142] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8cd907ae-697b-4fe3-86d4-e2e9f38ae424 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.320043] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.331884] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3b0f00e3-207a-416b-b971-c687df536a85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.341054] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f926a47b-5252-41a9-9987-027858179887 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.350800] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 852.351048] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 11 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 852.351197] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1920MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=11 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 852.679809] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f87c62e2-f76a-4a76-89c1-294f8e695a3c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.687701] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98b42da-b478-461e-933f-13bdb475cb43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.718187] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b19a678d-cf9a-468d-8a84-fd3704f3d0eb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.725615] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6aee1ea-52a0-4096-a222-89627bd48bdb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.739772] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 852.750165] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 852.763710] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record 
updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 852.763916] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 853.087796] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 853.111123] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 854.329895] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 855.325087] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 855.329718] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.330607] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 856.330893] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 862.647124] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquiring lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.647124] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 862.647458] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquiring lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.647458] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 862.647564] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 862.651256] env[60764]: INFO nova.compute.manager [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Terminating instance [ 862.652426] env[60764]: DEBUG nova.compute.manager [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 862.652644] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Powering off the VM {{(pid=60764) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 862.653134] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-9f97e6b6-df70-497e-98d4-e4c6f2dca2c6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.662541] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 862.662541] env[60764]: value = "task-2204916" [ 862.662541] env[60764]: _type = "Task" [ 862.662541] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 862.672763] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204916, 'name': PowerOffVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 863.173352] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204916, 'name': PowerOffVM_Task, 'duration_secs': 0.181772} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 863.173690] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Powered off the VM {{(pid=60764) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 863.173927] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Volume detach. 
Driver type: vmdk {{(pid=60764) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 863.174173] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-449673', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'name': 'volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e', 'attached_at': '', 'detached_at': '', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'serial': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9'} {{(pid=60764) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 863.175000] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa462fa-28ae-422a-aab4-16b580d8db09 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.195243] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e38b4c84-31a1-4bcf-b8b5-486e816b295c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.202368] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29caab4f-c085-47e6-bbc0-5f9aea90a3c4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.221851] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83b19e4f-ee33-411f-8735-fe469c6476ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.237750] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] The volume has not been displaced from its original location: [datastore2] volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9/volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9.vmdk. No consolidation needed. 
{{(pid=60764) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 863.244282] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Reconfiguring VM instance instance-00000021 to detach disk 2000 {{(pid=60764) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 863.244644] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-14979cde-b468-495b-9bd7-36e7345ed248 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.264368] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 863.264368] env[60764]: value = "task-2204917" [ 863.264368] env[60764]: _type = "Task" [ 863.264368] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.272292] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204917, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 863.775045] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204917, 'name': ReconfigVM_Task, 'duration_secs': 0.151541} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 863.775045] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Reconfigured VM instance instance-00000021 to detach disk 2000 {{(pid=60764) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 863.779054] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-78c95523-2f0a-4fc8-8dda-e76e8c5621f0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.794021] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 863.794021] env[60764]: value = "task-2204918" [ 863.794021] env[60764]: _type = "Task" [ 863.794021] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.801538] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204918, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 864.304445] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204918, 'name': ReconfigVM_Task, 'duration_secs': 0.131555} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 864.304693] env[60764]: DEBUG nova.virt.vmwareapi.volumeops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-449673', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'name': 'volume-387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e', 'attached_at': '', 'detached_at': '', 'volume_id': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9', 'serial': '387072a8-7c57-43cc-8a5f-d4eb1eff6cd9'} {{(pid=60764) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 864.305027] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 864.305758] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ca370d-a9c1-4567-9068-17007e1fa86c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.312150] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 864.312378] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7306ab2e-cb8e-4f9f-a05a-b8097df8bd35 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.373998] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 864.373998] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 864.374211] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 
tempest-ServersTestBootFromVolume-788721634-project-member] Deleting the datastore file [datastore2] c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 864.374474] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-04804a10-0cc5-4df8-9f83-0e6843d2f468 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.381264] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for the task: (returnval){ [ 864.381264] env[60764]: value = "task-2204920" [ 864.381264] env[60764]: _type = "Task" [ 864.381264] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 864.389402] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204920, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 864.890763] env[60764]: DEBUG oslo_vmware.api [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Task: {'id': task-2204920, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08105} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 864.891652] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 864.891652] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 864.891785] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 864.891991] env[60764]: INFO nova.compute.manager [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Took 2.24 seconds to destroy the instance on the hypervisor. [ 864.892269] env[60764]: DEBUG oslo.service.loopingcall [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 864.892486] env[60764]: DEBUG nova.compute.manager [-] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 864.892585] env[60764]: DEBUG nova.network.neutron [-] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 865.614886] env[60764]: DEBUG nova.network.neutron [-] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 865.630263] env[60764]: INFO nova.compute.manager [-] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Took 0.74 seconds to deallocate network for instance. [ 865.707509] env[60764]: INFO nova.compute.manager [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Took 0.08 seconds to detach 1 volumes for instance. [ 865.710110] env[60764]: DEBUG nova.compute.manager [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Deleting volume: 387072a8-7c57-43cc-8a5f-d4eb1eff6cd9 {{(pid=60764) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3221}} [ 865.729603] env[60764]: DEBUG nova.compute.manager [req-c113adc7-bf8f-47de-ba67-b748b8fc66a8 req-c4940de4-35f2-4eab-a8fe-69530bef1a36 service nova] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Received event network-vif-deleted-a1f88f2c-4533-4c02-ba84-252ae2327711 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 865.790075] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.790438] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.790627] env[60764]: DEBUG nova.objects.instance [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lazy-loading 'resources' on Instance uuid c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e {{(pid=60764) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 866.280271] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f226192d-da15-4954-9949-7ce3c9e8e721 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.287990] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91f9c5dd-cc27-40c5-8606-eba8b244ae5b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.318782] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c86b2f2d-1b2c-498c-bfef-6e5dd0710768 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.326018] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb9e281a-4b08-41d2-8722-b0bc98ec2922 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.338881] env[60764]: DEBUG nova.compute.provider_tree [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 866.347313] env[60764]: DEBUG nova.scheduler.client.report [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 866.368264] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.578s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 866.389815] env[60764]: INFO nova.scheduler.client.report [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Deleted allocations for instance c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e [ 866.441757] env[60764]: DEBUG oslo_concurrency.lockutils [None req-56703340-35e8-44ae-a063-602f746d92be tempest-ServersTestBootFromVolume-788721634 tempest-ServersTestBootFromVolume-788721634-project-member] Lock "c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.795s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 874.679545] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e018a361-fc91-49e3-962e-84d00cf37b33 tempest-ServerAddressesNegativeTestJSON-2136288688 tempest-ServerAddressesNegativeTestJSON-2136288688-project-member] Acquiring lock "9c01648d-4b7d-460a-b73d-e7324cf251e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 874.679896] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e018a361-fc91-49e3-962e-84d00cf37b33 tempest-ServerAddressesNegativeTestJSON-2136288688 tempest-ServerAddressesNegativeTestJSON-2136288688-project-member] Lock "9c01648d-4b7d-460a-b73d-e7324cf251e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 882.895326] env[60764]: WARNING oslo_vmware.rw_handles [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 882.895326] env[60764]: ERROR oslo_vmware.rw_handles [ 882.895946] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 882.897787] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 882.898043] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Copying Virtual Disk [datastore2] vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/a2e2f1fa-58e5-41cb-935e-e2a10a0bf949/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 882.898329] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-70e081d0-2af3-4296-a9b3-0259915b1d0a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 882.906695] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 882.906695] env[60764]: value = "task-2204922" [ 882.906695] env[60764]: _type = "Task" [ 882.906695] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 882.916495] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': task-2204922, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 883.417057] env[60764]: DEBUG oslo_vmware.exceptions [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 883.417057] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 883.417272] env[60764]: ERROR nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 883.417272] env[60764]: Faults: ['InvalidArgument'] [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Traceback (most recent call last): [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] yield resources [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self.driver.spawn(context, instance, image_meta, [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 883.417272] 
env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self._fetch_image_if_missing(context, vi) [ 883.417272] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] image_cache(vi, tmp_image_ds_loc) [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] vm_util.copy_virtual_disk( [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] session._wait_for_task(vmdk_copy_task) [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] return self.wait_for_task(task_ref) [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] return evt.wait() [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] result = hub.switch() [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 883.417688] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] return self.greenlet.switch() [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self.f(*self.args, **self.kw) [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] raise exceptions.translate_fault(task_info.error) [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: 
bea83327-9479-46b2-bd78-c81d72359e8a] Faults: ['InvalidArgument'] [ 883.418123] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] [ 883.418123] env[60764]: INFO nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Terminating instance [ 883.419059] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 883.419273] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 883.419509] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1e5a4efe-b448-47f2-a034-1b0cc11df8ca {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.421799] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 883.421994] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 883.422737] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e53865e8-7b11-4653-b1c6-758609903a27 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.429331] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 883.429566] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b382ce05-a7d5-4480-84a9-09979013cd55 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.431624] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 883.431798] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 883.432836] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3ab702fd-a1d4-435e-b67c-69c50106f0f4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.438123] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for the task: (returnval){ [ 883.438123] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]526ac565-0381-e310-7cce-393271fffcb1" [ 883.438123] env[60764]: _type = "Task" [ 883.438123] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 883.445777] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]526ac565-0381-e310-7cce-393271fffcb1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 883.496470] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 883.496702] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 883.496899] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Deleting the datastore file [datastore2] bea83327-9479-46b2-bd78-c81d72359e8a {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 883.497190] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a7cd77cc-dd06-4895-a536-9bda4d8b74c1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.503381] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 883.503381] env[60764]: value = "task-2204924" [ 883.503381] env[60764]: _type = "Task" [ 883.503381] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 883.510912] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': task-2204924, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 883.948540] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 883.948803] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Creating directory with path [datastore2] vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 883.949055] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9244f45d-9518-46eb-8d67-319168dcc1da {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.960512] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Created directory with path [datastore2] vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 883.960743] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Fetch image to [datastore2] vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 883.960991] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 883.961776] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33ead7f7-42de-4f09-8d8d-1a05c9f35590 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.968644] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4495620-e340-439d-9967-7f2e9613e1ef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.977682] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7edf5e2-12cd-47a1-aec9-7d8348080d30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.012027] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-82df26c6-b0b6-4929-907a-aa76ba4173c4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.019068] env[60764]: DEBUG oslo_vmware.api [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': task-2204924, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077842} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 884.021011] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 884.021011] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 884.021011] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 884.021208] env[60764]: INFO nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Took 0.60 seconds to destroy the instance on the hypervisor. 
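The power-off, VM-reconfigure and datastore-file-delete steps traced above all follow the same oslo.vmware call pattern: the driver invokes a vSphere *_Task method through the API session and then polls the returned task until it finishes, which is what the "Waiting for the task", "progress is N%." and "completed successfully" lines record. The following is a minimal Python sketch of that pattern only; the vCenter address, credentials and managed-object reference are placeholders, not values taken from this log.

    from oslo_vmware import api, vim_util

    # Placeholder connection details (not the vCenter from this log).
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   10,    # api_retry_count
                                   0.5)   # task_poll_interval, in seconds

    # Hypothetical managed-object reference for a VM.
    vm_ref = vim_util.get_moref('vm-12345', 'VirtualMachine')

    # vSphere methods ending in _Task (PowerOffVM_Task, ReconfigVM_Task,
    # DeleteDatastoreFile_Task, CopyVirtualDisk_Task, ...) return a Task object
    # rather than completing synchronously.
    task = session.invoke_api(session.vim, 'PowerOffVM_Task', vm_ref)

    # wait_for_task() polls the task until it reaches the 'success' state; if the
    # task fails, the poller raises the translated fault instead, which is how the
    # VimFaultException ("A specified parameter was not correct: fileType") for the
    # CopyVirtualDisk_Task surfaces in this log's tracebacks.
    session.wait_for_task(task)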
[ 884.022919] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-38ad2024-dca2-41db-a91a-89d38a3e562f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.024891] env[60764]: DEBUG nova.compute.claims [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 884.025079] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.025296] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 884.047142] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 884.102452] env[60764]: DEBUG oslo_vmware.rw_handles [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 884.166304] env[60764]: DEBUG oslo_vmware.rw_handles [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 884.166438] env[60764]: DEBUG oslo_vmware.rw_handles [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 884.478333] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acc1b1ec-960c-466b-831c-a1c0174f420a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.485792] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2e67c9c-e28a-4ee1-8c0e-04b08e8ee3f9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.514794] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9fc5049-6477-450e-a52d-34c6543a2348 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.521498] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05934376-dde9-446b-a112-550f6dc0c109 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.534219] env[60764]: DEBUG nova.compute.provider_tree [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 884.543457] env[60764]: DEBUG nova.scheduler.client.report [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 884.559417] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.534s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 884.559954] env[60764]: ERROR nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 884.559954] env[60764]: Faults: ['InvalidArgument'] [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Traceback (most recent call last): [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self.driver.spawn(context, instance, image_meta, [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self._fetch_image_if_missing(context, vi) [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] image_cache(vi, tmp_image_ds_loc) [ 884.559954] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] vm_util.copy_virtual_disk( [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] session._wait_for_task(vmdk_copy_task) [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] return self.wait_for_task(task_ref) [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] return evt.wait() [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] result = hub.switch() [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] return self.greenlet.switch() [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 884.560499] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] self.f(*self.args, **self.kw) [ 884.560840] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 884.560840] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] raise exceptions.translate_fault(task_info.error) [ 884.560840] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 884.560840] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Faults: ['InvalidArgument'] [ 884.560840] env[60764]: ERROR nova.compute.manager [instance: bea83327-9479-46b2-bd78-c81d72359e8a] [ 884.560840] env[60764]: DEBUG nova.compute.utils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 884.562045] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Build of instance bea83327-9479-46b2-bd78-c81d72359e8a was re-scheduled: A specified parameter was not correct: fileType [ 884.562045] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 884.562478] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 884.562668] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 884.562818] env[60764]: DEBUG nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 884.562974] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 884.882524] env[60764]: DEBUG nova.network.neutron [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 884.900384] env[60764]: INFO nova.compute.manager [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Took 0.34 seconds to deallocate network for instance. [ 885.021835] env[60764]: INFO nova.scheduler.client.report [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Deleted allocations for instance bea83327-9479-46b2-bd78-c81d72359e8a [ 885.041703] env[60764]: DEBUG oslo_concurrency.lockutils [None req-25cd1715-c45b-4d09-83dd-8ada4e9a47ab tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "bea83327-9479-46b2-bd78-c81d72359e8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.130s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 885.042972] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "bea83327-9479-46b2-bd78-c81d72359e8a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 144.612s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 885.043219] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "bea83327-9479-46b2-bd78-c81d72359e8a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 885.043433] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "bea83327-9479-46b2-bd78-c81d72359e8a-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 885.043688] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "bea83327-9479-46b2-bd78-c81d72359e8a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 885.045801] env[60764]: INFO nova.compute.manager [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Terminating instance [ 885.047554] env[60764]: DEBUG nova.compute.manager [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 885.047675] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 885.048364] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-002efa76-1e7e-4c4d-9965-87d159dd1fab {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.054220] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 885.060707] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c75dfe3-57f9-40cf-aeda-94e799944a44 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.089087] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bea83327-9479-46b2-bd78-c81d72359e8a could not be found. 
[ 885.089296] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 885.089467] env[60764]: INFO nova.compute.manager [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 885.089738] env[60764]: DEBUG oslo.service.loopingcall [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 885.090351] env[60764]: DEBUG nova.compute.manager [-] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 885.090351] env[60764]: DEBUG nova.network.neutron [-] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 885.107395] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 885.107629] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 885.109201] env[60764]: INFO nova.compute.claims [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 885.123293] env[60764]: DEBUG nova.network.neutron [-] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 885.140139] env[60764]: INFO nova.compute.manager [-] [instance: bea83327-9479-46b2-bd78-c81d72359e8a] Took 0.05 seconds to deallocate network for instance. 
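
Editor's note: the scheduler report lines in this log repeat the provider inventory as {'VCPU': {...}, 'MEMORY_MB': {...}, 'DISK_GB': {...}}, and the claim entries show capacity checks against it. As a worked example using exactly those numbers, effective capacity per resource class in placement is commonly computed as (total - reserved) * allocation_ratio; treat the snippet below as an illustration of that formula, not a restatement of the resource tracker's internals.

    # Worked example using the inventory values reported in this log.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def effective_capacity(inv):
        # capacity = (total - reserved) * allocation_ratio, per resource class
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(effective_capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
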
[ 885.245469] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ccee0870-4cd5-4dbf-9868-60dd30f99ac4 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "bea83327-9479-46b2-bd78-c81d72359e8a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 885.536931] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3526b05-8e32-4068-8e5c-0ab9c6a8facb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.545026] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b889c97f-2b63-44b4-ad46-e422af0170ff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.574660] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab3e5b22-f2df-4236-961b-4be40362ea3b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.581481] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8abd0d32-87e6-4090-a88f-e3f49a07141d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.594101] env[60764]: DEBUG nova.compute.provider_tree [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 885.603265] env[60764]: DEBUG nova.scheduler.client.report [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 885.618350] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.511s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 885.618823] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 885.654671] env[60764]: DEBUG nova.compute.utils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 885.655544] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 885.655722] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 885.667083] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 885.716966] env[60764]: DEBUG nova.policy [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '87d8cf59659a4a96b289651abc4547fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24be54e7930c4f8f86ef0415b3073e43', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 885.729562] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 885.779291] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 885.779291] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 885.779291] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 885.779487] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 885.779487] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 885.779487] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 885.779487] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 885.779487] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 885.779591] env[60764]: DEBUG 
nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 885.779591] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 885.779591] env[60764]: DEBUG nova.virt.hardware [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 885.779591] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89882565-9559-4a7c-8156-7854a8073007 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 885.779591] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d9b96d1-63ff-41e4-ab97-50f59f374b91 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 886.166681] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Successfully created port: 6c0b4c7b-0ad9-48e4-b777-68f15e04a06e {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 886.475814] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 886.476082] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 887.017113] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Successfully updated port: 6c0b4c7b-0ad9-48e4-b777-68f15e04a06e {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 887.031067] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock 
"refresh_cache-74b4bba7-8568-4fc4-a744-395a3271abc8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 887.031233] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquired lock "refresh_cache-74b4bba7-8568-4fc4-a744-395a3271abc8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 887.031380] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 887.089438] env[60764]: DEBUG nova.compute.manager [req-1fae436e-c371-4abd-8b58-4192f103a317 req-d0366ae7-4981-48c3-93ff-ca1868b5928f service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Received event network-vif-plugged-6c0b4c7b-0ad9-48e4-b777-68f15e04a06e {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 887.091501] env[60764]: DEBUG oslo_concurrency.lockutils [req-1fae436e-c371-4abd-8b58-4192f103a317 req-d0366ae7-4981-48c3-93ff-ca1868b5928f service nova] Acquiring lock "74b4bba7-8568-4fc4-a744-395a3271abc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 887.091695] env[60764]: DEBUG oslo_concurrency.lockutils [req-1fae436e-c371-4abd-8b58-4192f103a317 req-d0366ae7-4981-48c3-93ff-ca1868b5928f service nova] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 887.091871] env[60764]: DEBUG oslo_concurrency.lockutils [req-1fae436e-c371-4abd-8b58-4192f103a317 req-d0366ae7-4981-48c3-93ff-ca1868b5928f service nova] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 887.092054] env[60764]: DEBUG nova.compute.manager [req-1fae436e-c371-4abd-8b58-4192f103a317 req-d0366ae7-4981-48c3-93ff-ca1868b5928f service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] No waiting events found dispatching network-vif-plugged-6c0b4c7b-0ad9-48e4-b777-68f15e04a06e {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 887.092224] env[60764]: WARNING nova.compute.manager [req-1fae436e-c371-4abd-8b58-4192f103a317 req-d0366ae7-4981-48c3-93ff-ca1868b5928f service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Received unexpected event network-vif-plugged-6c0b4c7b-0ad9-48e4-b777-68f15e04a06e for instance with vm_state building and task_state spawning. [ 887.093152] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 887.515508] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Updating instance_info_cache with network_info: [{"id": "6c0b4c7b-0ad9-48e4-b777-68f15e04a06e", "address": "fa:16:3e:99:2f:74", "network": {"id": "12d59f7c-08e7-4b06-a190-cbcfacf25b0f", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1633127363-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24be54e7930c4f8f86ef0415b3073e43", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bec1528b-3e87-477b-8ab2-02696ad47e66", "external-id": "nsx-vlan-transportzone-180", "segmentation_id": 180, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c0b4c7b-0a", "ovs_interfaceid": "6c0b4c7b-0ad9-48e4-b777-68f15e04a06e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 887.527610] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Releasing lock "refresh_cache-74b4bba7-8568-4fc4-a744-395a3271abc8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 887.527911] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance network_info: |[{"id": "6c0b4c7b-0ad9-48e4-b777-68f15e04a06e", "address": "fa:16:3e:99:2f:74", "network": {"id": "12d59f7c-08e7-4b06-a190-cbcfacf25b0f", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1633127363-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24be54e7930c4f8f86ef0415b3073e43", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bec1528b-3e87-477b-8ab2-02696ad47e66", "external-id": "nsx-vlan-transportzone-180", "segmentation_id": 180, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c0b4c7b-0a", "ovs_interfaceid": "6c0b4c7b-0ad9-48e4-b777-68f15e04a06e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 887.528316] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:99:2f:74', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bec1528b-3e87-477b-8ab2-02696ad47e66', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6c0b4c7b-0ad9-48e4-b777-68f15e04a06e', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 887.535869] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Creating folder: Project (24be54e7930c4f8f86ef0415b3073e43). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 887.536514] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fb3875bf-dd1e-4259-869a-038d009d7c21 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 887.547838] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Created folder: Project (24be54e7930c4f8f86ef0415b3073e43) in parent group-v449629. [ 887.548040] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Creating folder: Instances. Parent ref: group-v449688. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 887.548284] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2d7b0ee4-aa04-4f85-b178-eeb0a7427723 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 887.558124] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Created folder: Instances in parent group-v449688. [ 887.558361] env[60764]: DEBUG oslo.service.loopingcall [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 887.558546] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 887.558745] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ca2a56c3-f8c9-4b80-91b9-8402954fa5d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 887.578414] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 887.578414] env[60764]: value = "task-2204927" [ 887.578414] env[60764]: _type = "Task" [ 887.578414] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 887.585919] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204927, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 888.088537] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204927, 'name': CreateVM_Task, 'duration_secs': 0.313332} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 888.088719] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 888.089411] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 888.089576] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 888.089909] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 888.090180] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-35103b8e-cef3-4667-bf8c-c474af5ed296 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 888.094730] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Waiting for the task: (returnval){ [ 888.094730] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5223fa43-ffd1-65bd-a7ff-f7fc0c78a815" [ 888.094730] env[60764]: _type = "Task" [ 888.094730] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 888.101930] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5223fa43-ffd1-65bd-a7ff-f7fc0c78a815, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 888.606683] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 888.606947] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 888.607169] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 889.173446] env[60764]: DEBUG nova.compute.manager [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Received event network-changed-6c0b4c7b-0ad9-48e4-b777-68f15e04a06e {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 889.173606] env[60764]: DEBUG nova.compute.manager [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Refreshing instance network info cache due to event network-changed-6c0b4c7b-0ad9-48e4-b777-68f15e04a06e. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 889.173825] env[60764]: DEBUG oslo_concurrency.lockutils [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] Acquiring lock "refresh_cache-74b4bba7-8568-4fc4-a744-395a3271abc8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 889.173961] env[60764]: DEBUG oslo_concurrency.lockutils [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] Acquired lock "refresh_cache-74b4bba7-8568-4fc4-a744-395a3271abc8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 889.174139] env[60764]: DEBUG nova.network.neutron [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Refreshing network info cache for port 6c0b4c7b-0ad9-48e4-b777-68f15e04a06e {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 889.477832] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 889.495791] env[60764]: DEBUG nova.network.neutron [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Updated VIF entry in instance network info cache for port 6c0b4c7b-0ad9-48e4-b777-68f15e04a06e. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 889.496616] env[60764]: DEBUG nova.network.neutron [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Updating instance_info_cache with network_info: [{"id": "6c0b4c7b-0ad9-48e4-b777-68f15e04a06e", "address": "fa:16:3e:99:2f:74", "network": {"id": "12d59f7c-08e7-4b06-a190-cbcfacf25b0f", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1633127363-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24be54e7930c4f8f86ef0415b3073e43", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bec1528b-3e87-477b-8ab2-02696ad47e66", "external-id": "nsx-vlan-transportzone-180", "segmentation_id": 180, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c0b4c7b-0a", "ovs_interfaceid": "6c0b4c7b-0ad9-48e4-b777-68f15e04a06e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 889.505617] env[60764]: DEBUG oslo_concurrency.lockutils [req-f0c9169b-6433-4ea8-a6b0-d2c6e53c3a83 req-ea221a27-abd3-4481-b384-ca8ff1a3f651 service nova] Releasing lock "refresh_cache-74b4bba7-8568-4fc4-a744-395a3271abc8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 911.331672] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 911.331947] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 911.331986] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 911.356146] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.356320] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.356456] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.356578] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.356699] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.357720] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.357720] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.357720] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.357720] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.357720] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 911.358154] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 912.329916] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 912.330250] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 912.342108] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 912.342377] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 912.342493] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 912.342647] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 912.344473] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-851824c9-f950-4f3a-9a1b-7ac79da723c1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.353703] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c90cbdd-6ed0-4c1d-8acb-b54b1a5f7648 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.368947] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b03b3bfc-5ffa-460e-814f-83e3fb7a6531 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.376149] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b4d123c-2968-4bb6-9dff-0b0543a35640 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.406858] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181062MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 912.407032] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 912.407242] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 912.487670] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.487843] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.487973] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ff2ef5e9-f543-4592-9896-e2c75369a971 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488113] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488232] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488347] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488462] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488573] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488685] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.488797] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 912.501014] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.513657] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 79528e3a-72e2-4d7e-913d-bb42d757fe64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.525551] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 15d15d22-4ffa-43a1-ab5a-506637d1a3cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.536622] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d2496c8d-c17c-4178-a2a9-85390aa0bb21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.547029] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7cef2173-9a2d-4428-81d2-f13b807967c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.558961] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.569470] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b2e4096c-0edc-44ac-a4b6-3a32e0466c54 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.579501] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 30aca955-c304-43e8-8da1-91cd7f4e1b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.589680] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c5627bb3-36c8-415f-bf4e-449adedd5ba6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.599433] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 5bd8caaf-f61b-4df2-abd0-3da5259ae829 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.609954] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9d4328ec-dab2-41c8-88da-82df6f2ae17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.622733] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 12c9b68c-740c-4555-915d-b23c8b5f0473 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.633422] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d7a99b7b-faa7-4904-9841-0ea582af96ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.643645] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8cd907ae-697b-4fe3-86d4-e2e9f38ae424 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.655190] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.665473] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3b0f00e3-207a-416b-b971-c687df536a85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.676927] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f926a47b-5252-41a9-9987-027858179887 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.687524] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.697482] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9c01648d-4b7d-460a-b73d-e7324cf251e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.708667] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 912.708915] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 912.709077] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 913.049192] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04d8cfa8-61af-4078-afc1-6ecf04216413 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.056833] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7dcedd0-f17f-48f2-96bf-d4e201423a6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.085508] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78bf9816-919f-4b96-9846-12d34abbf002 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.092422] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebd9e503-a222-402c-9b22-f25a0f419f89 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.105809] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 913.113865] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 913.126396] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 913.126562] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.719s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 914.127605] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 914.330053] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 915.330327] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 915.330590] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 917.325366] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 918.329979] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 918.329979] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 932.913688] env[60764]: WARNING oslo_vmware.rw_handles [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 932.913688] env[60764]: ERROR oslo_vmware.rw_handles [ 932.914346] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 932.917233] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 932.917508] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Copying Virtual Disk [datastore2] vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/468ab49d-f179-416b-be91-8c08e59a367a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 932.917792] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9f55ecfa-8d87-4378-b2c9-317e87373fa0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.925288] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for the 
task: (returnval){ [ 932.925288] env[60764]: value = "task-2204928" [ 932.925288] env[60764]: _type = "Task" [ 932.925288] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 932.933178] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Task: {'id': task-2204928, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 933.436942] env[60764]: DEBUG oslo_vmware.exceptions [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 933.437263] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 933.437810] env[60764]: ERROR nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 933.437810] env[60764]: Faults: ['InvalidArgument'] [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Traceback (most recent call last): [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] yield resources [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self.driver.spawn(context, instance, image_meta, [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self._vmops.spawn(context, instance, image_meta, injected_files, [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self._fetch_image_if_missing(context, vi) [ 933.437810] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: 
ff2ef5e9-f543-4592-9896-e2c75369a971] image_cache(vi, tmp_image_ds_loc) [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] vm_util.copy_virtual_disk( [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] session._wait_for_task(vmdk_copy_task) [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] return self.wait_for_task(task_ref) [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] return evt.wait() [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] result = hub.switch() [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 933.438145] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] return self.greenlet.switch() [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self.f(*self.args, **self.kw) [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] raise exceptions.translate_fault(task_info.error) [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Faults: ['InvalidArgument'] [ 933.438481] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] [ 933.438481] env[60764]: INFO nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Terminating instance [ 933.439697] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 
tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 933.439900] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 933.440870] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e551dcbf-9727-4be6-a440-f85f923691d9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.442314] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 933.442472] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquired lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 933.442637] env[60764]: DEBUG nova.network.neutron [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 933.449200] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 933.449378] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 933.450553] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55af4468-ea3a-461d-ba72-3e692909f6ea {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.457873] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Waiting for the task: (returnval){ [ 933.457873] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52c4e75a-1f86-3d24-90a5-9ea4640a3ea0" [ 933.457873] env[60764]: _type = "Task" [ 933.457873] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 933.465584] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52c4e75a-1f86-3d24-90a5-9ea4640a3ea0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 933.478165] env[60764]: DEBUG nova.network.neutron [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 933.579527] env[60764]: DEBUG nova.network.neutron [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 933.588477] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Releasing lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 933.588879] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 933.589086] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 933.590201] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adfc6a0d-f195-4538-8c87-43a4b6cb7234 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.597908] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 933.598140] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2d9882d4-43bd-4762-8702-25ee91d42a29 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.629828] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 933.630034] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 933.630217] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Deleting the datastore file [datastore2] ff2ef5e9-f543-4592-9896-e2c75369a971 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 933.630443] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-18d0d58f-e4a3-4b49-b399-faaacf63050d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.636770] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for the task: (returnval){ [ 933.636770] env[60764]: value = "task-2204930" [ 933.636770] env[60764]: _type = "Task" [ 933.636770] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 933.643913] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Task: {'id': task-2204930, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 933.968421] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 933.969822] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Creating directory with path [datastore2] vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 933.969822] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8387b9ab-c426-40c0-9e6f-8600288ffd39 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.980195] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Created directory with path [datastore2] vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 933.980396] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Fetch image to [datastore2] vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 933.980564] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 933.981468] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c1d129-61e8-4ca3-a6b3-99f2d4b2a07a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.988509] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2df360e6-b585-48f8-9e22-24bc438a3c70 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 933.999008] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-459544ea-444d-4591-a817-413904e67098 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.029909] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3f131aad-e000-4933-afc7-57179aaf113d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.035760] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-829407f4-7f1d-4819-a98c-ad45aca88726 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.068681] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 934.123057] env[60764]: DEBUG oslo_vmware.rw_handles [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 934.185502] env[60764]: DEBUG oslo_vmware.rw_handles [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 934.185772] env[60764]: DEBUG oslo_vmware.rw_handles [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 934.190818] env[60764]: DEBUG oslo_vmware.api [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Task: {'id': task-2204930, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033253} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 934.191169] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 934.191458] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 934.191665] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 934.191907] env[60764]: INFO nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Took 0.60 seconds to destroy the instance on the hypervisor. [ 934.192253] env[60764]: DEBUG oslo.service.loopingcall [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 934.192526] env[60764]: DEBUG nova.compute.manager [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 934.195428] env[60764]: DEBUG nova.compute.claims [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 934.195663] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 934.195948] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.556489] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c4745bf-975a-471e-9300-cb8fdc37ac42 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.563914] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a9120c0-6b87-4ef6-9aa9-576821773569 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.594016] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f06add3-b184-4c75-a2e7-2124f3eba3d6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.601260] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aedd609-1f74-46cd-8ddf-fa1230e1779e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.614270] env[60764]: DEBUG nova.compute.provider_tree [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 934.623553] env[60764]: DEBUG nova.scheduler.client.report [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 934.638473] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.442s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 934.639226] env[60764]: ERROR nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 934.639226] env[60764]: Faults: ['InvalidArgument'] [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Traceback (most recent call last): [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self.driver.spawn(context, instance, image_meta, [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self._vmops.spawn(context, instance, image_meta, injected_files, [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self._fetch_image_if_missing(context, vi) [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] image_cache(vi, tmp_image_ds_loc) [ 934.639226] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] vm_util.copy_virtual_disk( [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] session._wait_for_task(vmdk_copy_task) [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] return self.wait_for_task(task_ref) [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: 
ff2ef5e9-f543-4592-9896-e2c75369a971] return evt.wait() [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] result = hub.switch() [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] return self.greenlet.switch() [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 934.639581] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] self.f(*self.args, **self.kw) [ 934.639918] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 934.639918] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] raise exceptions.translate_fault(task_info.error) [ 934.639918] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 934.639918] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Faults: ['InvalidArgument'] [ 934.639918] env[60764]: ERROR nova.compute.manager [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] [ 934.639918] env[60764]: DEBUG nova.compute.utils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 934.641222] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Build of instance ff2ef5e9-f543-4592-9896-e2c75369a971 was re-scheduled: A specified parameter was not correct: fileType [ 934.641222] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 934.641919] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 934.641919] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 934.642051] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquired lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 934.642116] env[60764]: DEBUG nova.network.neutron [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 934.667752] env[60764]: DEBUG nova.network.neutron [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 934.725321] env[60764]: DEBUG nova.network.neutron [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 934.734295] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Releasing lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 934.734471] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 934.734650] env[60764]: DEBUG nova.compute.manager [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 934.817196] env[60764]: INFO nova.scheduler.client.report [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Deleted allocations for instance ff2ef5e9-f543-4592-9896-e2c75369a971 [ 934.839100] env[60764]: DEBUG oslo_concurrency.lockutils [None req-696cc89f-0689-4745-94c8-e632f4235012 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "ff2ef5e9-f543-4592-9896-e2c75369a971" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 380.650s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 934.840291] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "ff2ef5e9-f543-4592-9896-e2c75369a971" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 183.092s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.840510] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "ff2ef5e9-f543-4592-9896-e2c75369a971-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 934.840711] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "ff2ef5e9-f543-4592-9896-e2c75369a971-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.840873] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "ff2ef5e9-f543-4592-9896-e2c75369a971-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 934.842819] env[60764]: INFO nova.compute.manager [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Terminating instance [ 934.844335] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquiring lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 934.844488] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 
tempest-ServerDiagnosticsV248Test-511208848-project-member] Acquired lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 934.844668] env[60764]: DEBUG nova.network.neutron [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 934.855654] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 934.875509] env[60764]: DEBUG nova.network.neutron [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 934.908498] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 934.908786] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.910410] env[60764]: INFO nova.compute.claims [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 934.976116] env[60764]: DEBUG nova.network.neutron [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 934.985706] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Releasing lock "refresh_cache-ff2ef5e9-f543-4592-9896-e2c75369a971" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 934.986484] env[60764]: DEBUG nova.compute.manager [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Start 
destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 934.986897] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 934.987417] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5b746e6a-e655-4c6c-83a9-8a18c57b4e77 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.002255] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48dac9f4-e886-44f4-941e-fa5b511f9fee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.035474] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ff2ef5e9-f543-4592-9896-e2c75369a971 could not be found. [ 935.035690] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 935.035869] env[60764]: INFO nova.compute.manager [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Took 0.05 seconds to destroy the instance on the hypervisor. [ 935.036252] env[60764]: DEBUG oslo.service.loopingcall [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 935.038717] env[60764]: DEBUG nova.compute.manager [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 935.038824] env[60764]: DEBUG nova.network.neutron [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 935.055841] env[60764]: DEBUG nova.network.neutron [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 935.065726] env[60764]: DEBUG nova.network.neutron [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 935.075935] env[60764]: INFO nova.compute.manager [-] [instance: ff2ef5e9-f543-4592-9896-e2c75369a971] Took 0.04 seconds to deallocate network for instance. [ 935.170670] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ebadce86-bcba-4168-8320-281536b562e4 tempest-ServerDiagnosticsV248Test-511208848 tempest-ServerDiagnosticsV248Test-511208848-project-member] Lock "ff2ef5e9-f543-4592-9896-e2c75369a971" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.330s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 935.338443] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca00f6b9-b44b-4662-a0af-89baa18b4366 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.346670] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bd6983b-34de-4411-a18e-a47a22a98561 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.378169] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9731927a-d10e-4f26-807a-c4a881083806 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.384173] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beec6cc4-1de5-4c9f-87eb-669ca06a15b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.397742] env[60764]: DEBUG nova.compute.provider_tree [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 935.406581] env[60764]: DEBUG nova.scheduler.client.report [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 935.420076] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.511s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 935.421276] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 935.469909] env[60764]: DEBUG nova.compute.utils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 935.471413] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 935.471617] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 935.482411] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 935.548257] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 935.557721] env[60764]: DEBUG nova.policy [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1154fa431dad4ae1ae467fc3ea6206b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c4f5a1b557e4c31b54b7f87223a20d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 935.573200] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 935.575427] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 935.575427] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 935.575427] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 935.575427] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 935.575427] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 935.575670] env[60764]: DEBUG 
nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 935.575670] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 935.575856] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 935.576142] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 935.576417] env[60764]: DEBUG nova.virt.hardware [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 935.577425] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9234d0ae-399d-4429-8253-fe46f4bfcbcb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.585475] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-410ad5dc-d82d-4f33-b6ac-9ed3bc9c3019 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.987801] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Successfully created port: 43aa7a01-5308-4533-92ff-318e5aa61e57 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 936.834334] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Successfully updated port: 43aa7a01-5308-4533-92ff-318e5aa61e57 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 936.851024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "refresh_cache-dfd3e3af-90c9-420b-81ec-e9115c519016" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 936.851024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a 
tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "refresh_cache-dfd3e3af-90c9-420b-81ec-e9115c519016" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 936.851024] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 936.907647] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 936.925163] env[60764]: DEBUG nova.compute.manager [req-036bf1dc-9758-456f-8480-537f633565b4 req-632b8752-865f-4107-8ef5-06fed5183c16 service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Received event network-vif-plugged-43aa7a01-5308-4533-92ff-318e5aa61e57 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 936.928758] env[60764]: DEBUG oslo_concurrency.lockutils [req-036bf1dc-9758-456f-8480-537f633565b4 req-632b8752-865f-4107-8ef5-06fed5183c16 service nova] Acquiring lock "dfd3e3af-90c9-420b-81ec-e9115c519016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 936.928995] env[60764]: DEBUG oslo_concurrency.lockutils [req-036bf1dc-9758-456f-8480-537f633565b4 req-632b8752-865f-4107-8ef5-06fed5183c16 service nova] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.004s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 936.929233] env[60764]: DEBUG oslo_concurrency.lockutils [req-036bf1dc-9758-456f-8480-537f633565b4 req-632b8752-865f-4107-8ef5-06fed5183c16 service nova] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 936.929349] env[60764]: DEBUG nova.compute.manager [req-036bf1dc-9758-456f-8480-537f633565b4 req-632b8752-865f-4107-8ef5-06fed5183c16 service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] No waiting events found dispatching network-vif-plugged-43aa7a01-5308-4533-92ff-318e5aa61e57 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 936.929515] env[60764]: WARNING nova.compute.manager [req-036bf1dc-9758-456f-8480-537f633565b4 req-632b8752-865f-4107-8ef5-06fed5183c16 service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Received unexpected event network-vif-plugged-43aa7a01-5308-4533-92ff-318e5aa61e57 for instance with vm_state building and task_state spawning. 
[ 937.364833] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Updating instance_info_cache with network_info: [{"id": "43aa7a01-5308-4533-92ff-318e5aa61e57", "address": "fa:16:3e:62:df:b9", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43aa7a01-53", "ovs_interfaceid": "43aa7a01-5308-4533-92ff-318e5aa61e57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 937.378708] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "refresh_cache-dfd3e3af-90c9-420b-81ec-e9115c519016" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 937.378996] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance network_info: |[{"id": "43aa7a01-5308-4533-92ff-318e5aa61e57", "address": "fa:16:3e:62:df:b9", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43aa7a01-53", "ovs_interfaceid": "43aa7a01-5308-4533-92ff-318e5aa61e57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 937.379393] env[60764]: DEBUG 
nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:62:df:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd177c5b3-a5b1-4c78-854e-7e0dbf341ea1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '43aa7a01-5308-4533-92ff-318e5aa61e57', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 937.386892] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating folder: Project (4c4f5a1b557e4c31b54b7f87223a20d8). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 937.387430] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-622f3b84-c175-4366-b9bd-0baf0d93fde3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.398428] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created folder: Project (4c4f5a1b557e4c31b54b7f87223a20d8) in parent group-v449629. [ 937.398681] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating folder: Instances. Parent ref: group-v449691. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 937.398826] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c2e0abc5-ca36-4bad-a378-d8626f436b38 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.407470] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created folder: Instances in parent group-v449691. [ 937.407691] env[60764]: DEBUG oslo.service.loopingcall [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 937.407863] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 937.408078] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4374690d-bf56-475e-9188-d807d7fd64c4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.428593] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 937.428593] env[60764]: value = "task-2204933" [ 937.428593] env[60764]: _type = "Task" [ 937.428593] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 937.436359] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204933, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 937.939126] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204933, 'name': CreateVM_Task, 'duration_secs': 0.309506} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 937.939126] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 937.939600] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 937.939695] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 937.940039] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 937.940294] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ba3b4805-4d7e-4fcb-a8d9-73cf8289546a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 937.944802] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 937.944802] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ac042d-4e09-a5f2-b480-ba75de9fd221" [ 937.944802] env[60764]: _type = "Task" [ 937.944802] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 937.952788] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ac042d-4e09-a5f2-b480-ba75de9fd221, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 938.456242] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 938.456504] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 938.456689] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 939.063446] env[60764]: DEBUG nova.compute.manager [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Received event network-changed-43aa7a01-5308-4533-92ff-318e5aa61e57 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 939.063659] env[60764]: DEBUG nova.compute.manager [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Refreshing instance network info cache due to event network-changed-43aa7a01-5308-4533-92ff-318e5aa61e57. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 939.063874] env[60764]: DEBUG oslo_concurrency.lockutils [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] Acquiring lock "refresh_cache-dfd3e3af-90c9-420b-81ec-e9115c519016" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 939.064024] env[60764]: DEBUG oslo_concurrency.lockutils [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] Acquired lock "refresh_cache-dfd3e3af-90c9-420b-81ec-e9115c519016" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 939.064216] env[60764]: DEBUG nova.network.neutron [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Refreshing network info cache for port 43aa7a01-5308-4533-92ff-318e5aa61e57 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 939.420461] env[60764]: DEBUG nova.network.neutron [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Updated VIF entry in instance network info cache for port 43aa7a01-5308-4533-92ff-318e5aa61e57. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 939.420823] env[60764]: DEBUG nova.network.neutron [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Updating instance_info_cache with network_info: [{"id": "43aa7a01-5308-4533-92ff-318e5aa61e57", "address": "fa:16:3e:62:df:b9", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43aa7a01-53", "ovs_interfaceid": "43aa7a01-5308-4533-92ff-318e5aa61e57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 939.430523] env[60764]: DEBUG oslo_concurrency.lockutils [req-c82ed477-bc78-4f21-9d15-d6b152a81b7b req-453e1d5e-5420-4e16-bd75-bccc983f1e7c service nova] Releasing lock "refresh_cache-dfd3e3af-90c9-420b-81ec-e9115c519016" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 942.075611] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 942.076739] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.261910] env[60764]: DEBUG oslo_concurrency.lockutils [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "74b4bba7-8568-4fc4-a744-395a3271abc8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 948.287937] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d00705eb-25dc-4917-ba74-f6bfdff21186 tempest-VolumesAdminNegativeTest-518373138 
tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "594e3624-e282-4695-a6a7-88ab1e2ddfff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 948.288463] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d00705eb-25dc-4917-ba74-f6bfdff21186 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "594e3624-e282-4695-a6a7-88ab1e2ddfff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 954.797655] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dee959c2-bbe7-444c-98c2-94036d087188 tempest-ServerShowV257Test-236663532 tempest-ServerShowV257Test-236663532-project-member] Acquiring lock "ba8d2109-e600-4992-b997-c998ae288b59" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 954.798031] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dee959c2-bbe7-444c-98c2-94036d087188 tempest-ServerShowV257Test-236663532 tempest-ServerShowV257Test-236663532-project-member] Lock "ba8d2109-e600-4992-b997-c998ae288b59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 955.865026] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ff04037b-044e-4ffd-898d-f89290e17190 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "d8838301-49a7-4291-8091-6fc90fabc7bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 955.865337] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ff04037b-044e-4ffd-898d-f89290e17190 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "d8838301-49a7-4291-8091-6fc90fabc7bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 961.407022] env[60764]: DEBUG oslo_concurrency.lockutils [None req-931ef5a6-6237-4823-b4f4-d4270781bf5b tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Acquiring lock "69874f31-5316-4a8e-be6c-f77ac9f6ffbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 961.407022] env[60764]: DEBUG oslo_concurrency.lockutils [None req-931ef5a6-6237-4823-b4f4-d4270781bf5b tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Lock "69874f31-5316-4a8e-be6c-f77ac9f6ffbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 961.961905] env[60764]: DEBUG oslo_concurrency.lockutils [None req-afe52671-b4ed-4a57-80f1-0e3367827473 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Acquiring lock "919607ac-6116-49a7-a575-aff30f9e4c86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 961.962193] env[60764]: DEBUG oslo_concurrency.lockutils [None req-afe52671-b4ed-4a57-80f1-0e3367827473 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Lock "919607ac-6116-49a7-a575-aff30f9e4c86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 962.720241] env[60764]: DEBUG oslo_concurrency.lockutils [None req-285d52f0-159f-4f3c-92e1-573a0676dbe2 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Acquiring lock "c0c847ad-0c16-4796-a28c-9efdd19b7096" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 962.720525] env[60764]: DEBUG oslo_concurrency.lockutils [None req-285d52f0-159f-4f3c-92e1-573a0676dbe2 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Lock "c0c847ad-0c16-4796-a28c-9efdd19b7096" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 962.889111] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "dfd3e3af-90c9-420b-81ec-e9115c519016" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 971.227445] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e271d84-cd3d-4668-a5da-9711aad0d67f tempest-AttachInterfacesV270Test-2106429919 tempest-AttachInterfacesV270Test-2106429919-project-member] Acquiring lock "33f526a8-730d-4264-b24a-1dd892343a15" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 971.227802] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e271d84-cd3d-4668-a5da-9711aad0d67f tempest-AttachInterfacesV270Test-2106429919 tempest-AttachInterfacesV270Test-2106429919-project-member] Lock "33f526a8-730d-4264-b24a-1dd892343a15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 971.330339] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 971.330497] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 971.330598] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 971.352753] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.352952] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353161] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353347] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353481] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353624] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353752] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353878] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.353996] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.354134] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 971.354252] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 972.329761] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 972.340608] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 972.340824] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 972.340978] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 972.341162] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 972.342726] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-815bfb7e-5ebd-432d-9a19-54a42b0819b7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 972.351439] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16168479-346f-49ef-ab10-9e7b86c37039 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 972.366740] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44015ca1-30f9-4a9b-9c9a-093de7517746 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 972.373203] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53cb2d95-a01d-4d5a-ac6a-9f8a7a9f791c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 972.401879] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: 
name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181273MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 972.402021] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 972.402215] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 972.480129] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.480298] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.480427] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.480549] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.480674] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.480791] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.480909] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.481030] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.481148] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.481260] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 972.493751] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 79528e3a-72e2-4d7e-913d-bb42d757fe64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.504029] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 15d15d22-4ffa-43a1-ab5a-506637d1a3cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.515346] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d2496c8d-c17c-4178-a2a9-85390aa0bb21 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.525545] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7cef2173-9a2d-4428-81d2-f13b807967c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.535516] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.545886] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b2e4096c-0edc-44ac-a4b6-3a32e0466c54 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.561374] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 30aca955-c304-43e8-8da1-91cd7f4e1b38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.572373] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c5627bb3-36c8-415f-bf4e-449adedd5ba6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.585893] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 5bd8caaf-f61b-4df2-abd0-3da5259ae829 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.596915] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9d4328ec-dab2-41c8-88da-82df6f2ae17f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.606896] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 12c9b68c-740c-4555-915d-b23c8b5f0473 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.621926] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d7a99b7b-faa7-4904-9841-0ea582af96ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.631658] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8cd907ae-697b-4fe3-86d4-e2e9f38ae424 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.642779] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.652452] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3b0f00e3-207a-416b-b971-c687df536a85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.661773] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f926a47b-5252-41a9-9987-027858179887 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.673468] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.683312] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9c01648d-4b7d-460a-b73d-e7324cf251e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.692875] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.702667] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.712243] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 594e3624-e282-4695-a6a7-88ab1e2ddfff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.721419] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ba8d2109-e600-4992-b997-c998ae288b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.730837] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d8838301-49a7-4291-8091-6fc90fabc7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.740365] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 69874f31-5316-4a8e-be6c-f77ac9f6ffbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.749636] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 919607ac-6116-49a7-a575-aff30f9e4c86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.758877] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c0c847ad-0c16-4796-a28c-9efdd19b7096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.771107] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 33f526a8-730d-4264-b24a-1dd892343a15 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 972.771107] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 972.771107] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 973.166825] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80acb2cd-e32c-45e9-82cd-8c90854c2483 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 973.174484] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ededc00-8d50-4459-86d2-8a55771fdd25 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 973.204099] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9ae74e-032d-4509-bb01-dcff0dc6daa3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 973.211311] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe70095-1c44-4e53-88e8-b4d15cd6c84d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 973.225267] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 973.238149] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 973.252506] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 973.252506] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.850s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 975.253576] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.324915] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.347594] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 976.330828] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 976.330828] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 977.325785] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 977.329378] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 979.329962] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 979.330260] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, 
skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 979.577643] env[60764]: DEBUG oslo_concurrency.lockutils [None req-52288769-4b2d-45b5-9af8-2178ead9b69f tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "f7cbb5ac-1fcf-457a-adea-bce8e9765699" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 979.577643] env[60764]: DEBUG oslo_concurrency.lockutils [None req-52288769-4b2d-45b5-9af8-2178ead9b69f tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f7cbb5ac-1fcf-457a-adea-bce8e9765699" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 981.865019] env[60764]: WARNING oslo_vmware.rw_handles [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 981.865019] env[60764]: ERROR oslo_vmware.rw_handles [ 981.865019] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 981.866011] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 981.866789] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 
tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Copying Virtual Disk [datastore2] vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/85b07b19-0187-43ce-b06c-2ebb4f588f4b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 981.867126] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-197564cd-a0bd-41ad-84ee-a11ff4da9f6f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.875280] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Waiting for the task: (returnval){ [ 981.875280] env[60764]: value = "task-2204934" [ 981.875280] env[60764]: _type = "Task" [ 981.875280] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 981.884454] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Task: {'id': task-2204934, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 981.920528] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "74709301-6eae-40c1-b987-4be9262ef7ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 981.920766] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "74709301-6eae-40c1-b987-4be9262ef7ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 982.386611] env[60764]: DEBUG oslo_vmware.exceptions [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 982.387073] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 982.387588] env[60764]: ERROR nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 982.387588] env[60764]: Faults: ['InvalidArgument'] [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Traceback (most recent call last): [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] yield resources [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self.driver.spawn(context, instance, image_meta, [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self._fetch_image_if_missing(context, vi) [ 982.387588] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] image_cache(vi, tmp_image_ds_loc) [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] vm_util.copy_virtual_disk( [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] session._wait_for_task(vmdk_copy_task) [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] return self.wait_for_task(task_ref) [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] return evt.wait() [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] result = hub.switch() [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 982.387985] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] return self.greenlet.switch() [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self.f(*self.args, **self.kw) [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] raise exceptions.translate_fault(task_info.error) [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Faults: ['InvalidArgument'] [ 982.388419] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] [ 982.388419] env[60764]: INFO nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Terminating instance [ 982.389875] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 982.390074] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 982.390800] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 
tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 982.391036] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 982.391443] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b62f6c65-a999-46e0-8233-68a11217c99c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.393767] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce4d641f-9687-4f38-bfe1-425427657c1c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.401038] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 982.402078] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7f25ede8-4abd-4a17-b271-b41b956a8e57 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.403561] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 982.403792] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 982.404549] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8288a2bf-0f48-4359-8c84-bc3c997a9e27 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.409551] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Waiting for the task: (returnval){ [ 982.409551] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5283a477-216a-7e6d-05b7-9338053f5df9" [ 982.409551] env[60764]: _type = "Task" [ 982.409551] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 982.416795] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5283a477-216a-7e6d-05b7-9338053f5df9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 982.481870] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 982.482163] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 982.482426] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Deleting the datastore file [datastore2] 4a0b0a82-3910-4201-b1f7-34c862667e3c {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 982.482699] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-21f97d2a-ea3f-4a7e-8bf5-94c3e9f9c178 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.488856] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Waiting for the task: (returnval){ [ 982.488856] env[60764]: value = "task-2204936" [ 982.488856] env[60764]: _type = "Task" [ 982.488856] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 982.498424] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Task: {'id': task-2204936, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 982.919891] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 982.920180] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Creating directory with path [datastore2] vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 982.920545] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-258e2b27-b604-4a18-ae22-499bce213ef7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.932490] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Created directory with path [datastore2] vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 982.932700] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Fetch image to [datastore2] vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 982.932871] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 982.933678] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b2b31f6-cf55-4237-bba1-4685f7fdf340 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.942202] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a72905-def9-4ef2-ab27-8a02247f9781 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.952316] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93f5716e-23db-46d3-a99b-24c5a7eb506d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.987983] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3a5cadc7-7a67-4116-9d2c-0360534476f0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.999799] env[60764]: DEBUG oslo_vmware.api [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Task: {'id': task-2204936, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067833} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 983.000073] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1e63f8c5-2e70-4e6b-b0ae-b9f82e059cd9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.001823] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 983.002024] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 983.002204] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 983.002376] env[60764]: INFO nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Took 0.61 seconds to destroy the instance on the hypervisor. 
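The destroy sequence above (VirtualMachine.UnregisterVM, then FileManager.DeleteDatastoreFile_Task, then polling the task until it completes) follows oslo.vmware's invoke_api / wait_for_task pattern that recurs throughout this log. A minimal sketch of that pattern is given below; the vCenter endpoint, credentials and datastore path are placeholders, and this is an illustration of the library calls named in the log, not the actual Nova ds_util/vmops code.

# Sketch only: invoke a vCenter *_Task method through an oslo.vmware session,
# then block on wait_for_task(), which polls the task (the "progress is 0%"
# lines above) until it succeeds or raises a translated fault.
from oslo_vmware import api

# Placeholder endpoint/credentials; positional args are host, username,
# password, API retry count, task poll interval (seconds).
session = api.VMwareAPISession('vc.example.test', 'user', 'secret', 10, 0.5)

# Delete a datastore file, as FileManager.DeleteDatastoreFile_Task does above.
# A real call passes the Datacenter managed-object reference instead of None.
task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                          session.vim.service_content.fileManager,
                          name='[datastore2] example-dir/example.vmdk',
                          datacenter=None)
session.wait_for_task(task)  # raises an oslo_vmware exception on task failure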
[ 983.007902] env[60764]: DEBUG nova.compute.claims [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 983.007902] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 983.008039] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 983.024090] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 983.076672] env[60764]: DEBUG oslo_vmware.rw_handles [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 983.141895] env[60764]: DEBUG oslo_vmware.rw_handles [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 983.142112] env[60764]: DEBUG oslo_vmware.rw_handles [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 983.522127] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2de5fb6-9f55-4de9-a743-c3d43cf6cba7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.529467] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b35a7e73-c1de-460d-aa5f-2805441b0568 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.560301] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8cbe74b-5c04-4e62-b59f-e6407e4a83a4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.567421] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c798cc96-8e16-454e-8c71-6399a6699e7c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.580342] env[60764]: DEBUG nova.compute.provider_tree [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 983.589992] env[60764]: DEBUG nova.scheduler.client.report [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 983.606548] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.598s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 983.607127] env[60764]: ERROR nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 983.607127] env[60764]: Faults: ['InvalidArgument'] [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Traceback (most recent call last): [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self.driver.spawn(context, instance, image_meta, [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self._fetch_image_if_missing(context, vi) [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] image_cache(vi, tmp_image_ds_loc) [ 983.607127] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] vm_util.copy_virtual_disk( [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] session._wait_for_task(vmdk_copy_task) [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] return self.wait_for_task(task_ref) [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] return evt.wait() [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] result = hub.switch() [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] return self.greenlet.switch() [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 983.607467] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] self.f(*self.args, **self.kw) [ 983.607760] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 983.607760] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] raise exceptions.translate_fault(task_info.error) [ 983.607760] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 983.607760] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Faults: ['InvalidArgument'] [ 983.607760] env[60764]: ERROR nova.compute.manager [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] [ 983.607878] env[60764]: DEBUG nova.compute.utils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 983.609268] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Build of instance 4a0b0a82-3910-4201-b1f7-34c862667e3c was re-scheduled: A specified parameter was not correct: fileType [ 983.609268] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 983.609633] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 983.609802] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 983.609953] env[60764]: DEBUG nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 983.610134] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 983.926257] env[60764]: DEBUG nova.network.neutron [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 983.940022] env[60764]: INFO nova.compute.manager [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Took 0.33 seconds to deallocate network for instance. [ 984.043635] env[60764]: INFO nova.scheduler.client.report [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Deleted allocations for instance 4a0b0a82-3910-4201-b1f7-34c862667e3c [ 984.068777] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e9680398-7940-4afb-9545-f8eeff318bd7 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 430.802s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.069997] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 232.386s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 984.070228] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Acquiring lock "4a0b0a82-3910-4201-b1f7-34c862667e3c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 984.070422] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 984.070626] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.072646] env[60764]: INFO nova.compute.manager [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Terminating instance [ 984.074440] env[60764]: DEBUG nova.compute.manager [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 984.076113] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 984.076113] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5ed93829-f459-4193-9a44-3d87ec5f06a7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.080352] env[60764]: DEBUG nova.compute.manager [None req-46c1d467-41fe-43a8-b217-524ee57d9c5b tempest-ServersV294TestFqdnHostnames-693022799 tempest-ServersV294TestFqdnHostnames-693022799-project-member] [instance: 79528e3a-72e2-4d7e-913d-bb42d757fe64] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 984.087342] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5fc2ebd-4e9a-4c13-a2bb-044920e81e78 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.116809] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a0b0a82-3910-4201-b1f7-34c862667e3c could not be found. 
[ 984.117217] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 984.117380] env[60764]: INFO nova.compute.manager [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 984.117624] env[60764]: DEBUG oslo.service.loopingcall [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 984.118029] env[60764]: DEBUG nova.compute.manager [None req-46c1d467-41fe-43a8-b217-524ee57d9c5b tempest-ServersV294TestFqdnHostnames-693022799 tempest-ServersV294TestFqdnHostnames-693022799-project-member] [instance: 79528e3a-72e2-4d7e-913d-bb42d757fe64] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 984.119799] env[60764]: DEBUG nova.compute.manager [-] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 984.119799] env[60764]: DEBUG nova.network.neutron [-] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 984.144753] env[60764]: DEBUG oslo_concurrency.lockutils [None req-46c1d467-41fe-43a8-b217-524ee57d9c5b tempest-ServersV294TestFqdnHostnames-693022799 tempest-ServersV294TestFqdnHostnames-693022799-project-member] Lock "79528e3a-72e2-4d7e-913d-bb42d757fe64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.785s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.147888] env[60764]: DEBUG nova.network.neutron [-] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 984.156122] env[60764]: INFO nova.compute.manager [-] [instance: 4a0b0a82-3910-4201-b1f7-34c862667e3c] Took 0.04 seconds to deallocate network for instance. [ 984.158790] env[60764]: DEBUG nova.compute.manager [None req-c204e2a0-dd3b-488c-9457-66cb66b13916 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 24977d06-906a-4000-9b8f-262085258c6b] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 984.195057] env[60764]: DEBUG nova.compute.manager [None req-c204e2a0-dd3b-488c-9457-66cb66b13916 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 24977d06-906a-4000-9b8f-262085258c6b] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 984.225217] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c204e2a0-dd3b-488c-9457-66cb66b13916 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "24977d06-906a-4000-9b8f-262085258c6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.716s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.237154] env[60764]: DEBUG nova.compute.manager [None req-4e3384b1-c280-4797-aea9-6d13f5ce9222 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 15d15d22-4ffa-43a1-ab5a-506637d1a3cb] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 984.267424] env[60764]: DEBUG nova.compute.manager [None req-4e3384b1-c280-4797-aea9-6d13f5ce9222 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 15d15d22-4ffa-43a1-ab5a-506637d1a3cb] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 984.281049] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9cee9a26-3a92-425d-a1a2-5ac1a54c3079 tempest-ServersAdminNegativeTestJSON-306532850 tempest-ServersAdminNegativeTestJSON-306532850-project-member] Lock "4a0b0a82-3910-4201-b1f7-34c862667e3c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.211s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.290388] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e3384b1-c280-4797-aea9-6d13f5ce9222 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "15d15d22-4ffa-43a1-ab5a-506637d1a3cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.254s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.300147] env[60764]: DEBUG nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 984.352897] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 984.353151] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 984.354731] env[60764]: INFO nova.compute.claims [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 984.556988] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 984.839078] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5928abd5-443a-4876-8296-194b4f30bada {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.846928] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf08f919-a7e8-4b19-9dc1-17e3b686d234 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.877988] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd67c28-efd4-4399-b22c-bd45b8450cb1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.885174] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3780aa34-1458-4a41-8fa0-23ec672e2dbc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.898189] env[60764]: DEBUG nova.compute.provider_tree [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 984.906476] env[60764]: DEBUG nova.scheduler.client.report [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 984.919672] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.566s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 984.920133] env[60764]: DEBUG nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 984.970563] env[60764]: DEBUG nova.compute.claims [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 984.970809] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 984.970988] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 985.463187] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ffe778f-9c6e-4d76-acbb-5da8cd92e36c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 985.470322] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2e10d57-aa7e-49d8-87f3-ee0ffed3bcf8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 985.503030] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a33482d3-0205-4ca4-a57f-bcdd2d8d8ae5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 985.510367] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e19f8e-47c5-41b6-8963-8fa4f3bcc4a9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 985.523978] env[60764]: DEBUG nova.compute.provider_tree [None 
req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 985.532788] env[60764]: DEBUG nova.scheduler.client.report [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 985.547739] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.576s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 985.548134] env[60764]: DEBUG nova.compute.utils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Conflict updating instance d2496c8d-c17c-4178-a2a9-85390aa0bb21. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 985.549671] env[60764]: DEBUG nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance disappeared during build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2487}} [ 985.549840] env[60764]: DEBUG nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 985.550068] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "refresh_cache-d2496c8d-c17c-4178-a2a9-85390aa0bb21" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 985.550218] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquired lock "refresh_cache-d2496c8d-c17c-4178-a2a9-85390aa0bb21" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 985.550375] env[60764]: DEBUG nova.network.neutron [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 985.578021] env[60764]: DEBUG nova.network.neutron [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 985.765107] env[60764]: DEBUG nova.network.neutron [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 985.777806] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Releasing lock "refresh_cache-d2496c8d-c17c-4178-a2a9-85390aa0bb21" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 985.777806] env[60764]: DEBUG nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 985.777806] env[60764]: DEBUG nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 985.777806] env[60764]: DEBUG nova.network.neutron [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 985.802281] env[60764]: DEBUG nova.network.neutron [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 985.814880] env[60764]: DEBUG nova.network.neutron [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 985.824962] env[60764]: INFO nova.compute.manager [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Took 0.05 seconds to deallocate network for instance. 
[ 985.912019] env[60764]: INFO nova.scheduler.client.report [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Deleted allocations for instance d2496c8d-c17c-4178-a2a9-85390aa0bb21 [ 985.912019] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a4971d17-99c0-43c5-9753-b3474ef2b122 tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.381s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 985.912585] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 1.356s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 985.912794] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 985.912993] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 985.913168] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 985.914855] env[60764]: INFO nova.compute.manager [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Terminating instance [ 985.917523] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquiring lock "refresh_cache-d2496c8d-c17c-4178-a2a9-85390aa0bb21" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 985.917680] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Acquired lock "refresh_cache-d2496c8d-c17c-4178-a2a9-85390aa0bb21" 
{{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 985.917842] env[60764]: DEBUG nova.network.neutron [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 985.925345] env[60764]: DEBUG nova.compute.manager [None req-4b7f9121-01c4-4ce3-9ed7-ea23702d8d86 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: 7cef2173-9a2d-4428-81d2-f13b807967c6] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 985.946247] env[60764]: DEBUG nova.network.neutron [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 985.958342] env[60764]: DEBUG nova.compute.manager [None req-4b7f9121-01c4-4ce3-9ed7-ea23702d8d86 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: 7cef2173-9a2d-4428-81d2-f13b807967c6] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 985.978928] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4b7f9121-01c4-4ce3-9ed7-ea23702d8d86 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "7cef2173-9a2d-4428-81d2-f13b807967c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.749s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 985.988370] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 986.046029] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 986.046029] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 986.046029] env[60764]: INFO nova.compute.claims [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 986.305749] env[60764]: DEBUG nova.network.neutron [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 986.322099] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Releasing lock "refresh_cache-d2496c8d-c17c-4178-a2a9-85390aa0bb21" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 986.322518] env[60764]: DEBUG nova.compute.manager [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 986.322704] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 986.323260] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b83e00d6-3a7b-45d3-bdf0-3a76002a46f5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.337399] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f79b91d-007b-44cc-9af6-bc2d80e71e7a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.365363] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d2496c8d-c17c-4178-a2a9-85390aa0bb21 could not be found. [ 986.365573] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 986.365748] env[60764]: INFO nova.compute.manager [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Took 0.04 seconds to destroy the instance on the hypervisor. [ 986.365998] env[60764]: DEBUG oslo.service.loopingcall [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 986.368671] env[60764]: DEBUG nova.compute.manager [-] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 986.368777] env[60764]: DEBUG nova.network.neutron [-] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 986.388843] env[60764]: DEBUG nova.network.neutron [-] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 986.396411] env[60764]: DEBUG nova.network.neutron [-] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 986.407170] env[60764]: INFO nova.compute.manager [-] [instance: d2496c8d-c17c-4178-a2a9-85390aa0bb21] Took 0.04 seconds to deallocate network for instance. 
[ 986.509540] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49853511-24db-4564-9e1f-52ccdfaf509f tempest-SecurityGroupsTestJSON-1536132228 tempest-SecurityGroupsTestJSON-1536132228-project-member] Lock "d2496c8d-c17c-4178-a2a9-85390aa0bb21" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.597s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 986.569329] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b61ec29f-fc39-4939-8cc6-60d444bd0f99 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.577317] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a38e7964-0bea-4be7-8b69-e46d1d02c758 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.608170] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31ea7641-3b2b-490e-9845-4ddfddbb3ddd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.615428] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b596036-0825-4958-8e58-d87fd8a0a2c6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.628985] env[60764]: DEBUG nova.compute.provider_tree [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 986.637642] env[60764]: DEBUG nova.scheduler.client.report [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 986.654450] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.611s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 986.654927] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 986.695951] env[60764]: DEBUG nova.compute.utils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 986.697668] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 986.697813] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 986.706086] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 986.773360] env[60764]: DEBUG nova.policy [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b4fce735105438d8ecec1bdf8966b0d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5e056495f923431badeff3bed318e8cf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 986.799413] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 986.828463] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 986.828724] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 986.828877] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 986.829066] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 986.829209] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 986.831067] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 986.831067] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 986.831067] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 986.831067] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 
tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 986.831067] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 986.831221] env[60764]: DEBUG nova.virt.hardware [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 986.831221] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35120090-f755-4f66-8314-5829081f69e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.839522] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-013aa03a-e9e1-49b4-bd86-a0243b1cf0a1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 986.978252] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "b3ca6987-3415-4db5-a514-cd66c342eb7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 986.978467] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 987.022432] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "e80dd396-f709-48d7-bc98-159b175f5593" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 987.197232] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Successfully created port: ec29421e-046d-4686-bfd0-4f7dc5508d41 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 987.639961] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Successfully created port: 8b8858fb-5e02-43ed-b788-274f827ae3cb {{(pid=60764) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 988.480925] env[60764]: DEBUG nova.compute.manager [req-4f19be92-7537-4be8-8e9a-7814b813fc0f req-c93d8c54-e02f-40d4-9f5d-c5bfaec702b6 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Received event network-vif-plugged-ec29421e-046d-4686-bfd0-4f7dc5508d41 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 988.481284] env[60764]: DEBUG oslo_concurrency.lockutils [req-4f19be92-7537-4be8-8e9a-7814b813fc0f req-c93d8c54-e02f-40d4-9f5d-c5bfaec702b6 service nova] Acquiring lock "e80dd396-f709-48d7-bc98-159b175f5593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 988.481661] env[60764]: DEBUG oslo_concurrency.lockutils [req-4f19be92-7537-4be8-8e9a-7814b813fc0f req-c93d8c54-e02f-40d4-9f5d-c5bfaec702b6 service nova] Lock "e80dd396-f709-48d7-bc98-159b175f5593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 988.481988] env[60764]: DEBUG oslo_concurrency.lockutils [req-4f19be92-7537-4be8-8e9a-7814b813fc0f req-c93d8c54-e02f-40d4-9f5d-c5bfaec702b6 service nova] Lock "e80dd396-f709-48d7-bc98-159b175f5593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 988.482313] env[60764]: DEBUG nova.compute.manager [req-4f19be92-7537-4be8-8e9a-7814b813fc0f req-c93d8c54-e02f-40d4-9f5d-c5bfaec702b6 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] No waiting events found dispatching network-vif-plugged-ec29421e-046d-4686-bfd0-4f7dc5508d41 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 988.482550] env[60764]: WARNING nova.compute.manager [req-4f19be92-7537-4be8-8e9a-7814b813fc0f req-c93d8c54-e02f-40d4-9f5d-c5bfaec702b6 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Received unexpected event network-vif-plugged-ec29421e-046d-4686-bfd0-4f7dc5508d41 for instance with vm_state building and task_state deleting. 
[ 988.528416] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Successfully updated port: ec29421e-046d-4686-bfd0-4f7dc5508d41 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 989.756591] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Successfully updated port: 8b8858fb-5e02-43ed-b788-274f827ae3cb {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 989.770867] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 989.770867] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquired lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 989.770867] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 989.864176] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 990.584587] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updating instance_info_cache with network_info: [{"id": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "address": "fa:16:3e:6b:10:08", "network": {"id": "bf4a28e2-f8ac-432d-b183-d17a7b28d5d8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-355930240", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec29421e-04", "ovs_interfaceid": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "address": "fa:16:3e:9b:28:6e", "network": {"id": "a30ae683-0278-4e3c-8817-a9abc862306b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1328207510", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0954fad3-d24d-496c-83e6-a09d3cb556fc", "external-id": "nsx-vlan-transportzone-216", "segmentation_id": 216, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b8858fb-5e", "ovs_interfaceid": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 990.599223] env[60764]: DEBUG nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Received event network-changed-ec29421e-046d-4686-bfd0-4f7dc5508d41 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 990.599462] env[60764]: DEBUG nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Refreshing instance network info cache due to event network-changed-ec29421e-046d-4686-bfd0-4f7dc5508d41. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 990.599636] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Acquiring lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 990.602755] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Releasing lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 990.602878] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance network_info: |[{"id": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "address": "fa:16:3e:6b:10:08", "network": {"id": "bf4a28e2-f8ac-432d-b183-d17a7b28d5d8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-355930240", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec29421e-04", "ovs_interfaceid": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "address": "fa:16:3e:9b:28:6e", "network": {"id": "a30ae683-0278-4e3c-8817-a9abc862306b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1328207510", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0954fad3-d24d-496c-83e6-a09d3cb556fc", "external-id": "nsx-vlan-transportzone-216", "segmentation_id": 216, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b8858fb-5e", "ovs_interfaceid": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 990.603825] env[60764]: DEBUG oslo_concurrency.lockutils 
[req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Acquired lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 990.604072] env[60764]: DEBUG nova.network.neutron [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Refreshing network info cache for port ec29421e-046d-4686-bfd0-4f7dc5508d41 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 990.605079] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:10:08', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac2c9d07-ed01-47a9-88f1-562992bc1076', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ec29421e-046d-4686-bfd0-4f7dc5508d41', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:28:6e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0954fad3-d24d-496c-83e6-a09d3cb556fc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8b8858fb-5e02-43ed-b788-274f827ae3cb', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 990.619417] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Creating folder: Project (5e056495f923431badeff3bed318e8cf). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 990.623621] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7aa8f4ab-f656-4ba7-8f93-882c0c424303 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.635637] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Created folder: Project (5e056495f923431badeff3bed318e8cf) in parent group-v449629. [ 990.635844] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Creating folder: Instances. Parent ref: group-v449694. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 990.636099] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f584c99b-92a1-492c-af30-98cdf67b46c8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.646233] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Created folder: Instances in parent group-v449694. 
[ 990.646345] env[60764]: DEBUG oslo.service.loopingcall [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 990.646543] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 990.646749] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e3b56b0e-560c-47e0-a80a-c5c5e52d35fc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 990.673911] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 990.673911] env[60764]: value = "task-2204939" [ 990.673911] env[60764]: _type = "Task" [ 990.673911] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 990.682381] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204939, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 991.187172] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204939, 'name': CreateVM_Task, 'duration_secs': 0.387132} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 991.187520] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 991.188672] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 991.188872] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 991.190856] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 991.190856] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b70195a7-d838-4c42-b68d-477853a118b0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.194689] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Waiting for the task: (returnval){ [ 
991.194689] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5239ff77-cc3c-36db-e347-357ceda9426c" [ 991.194689] env[60764]: _type = "Task" [ 991.194689] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 991.206975] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5239ff77-cc3c-36db-e347-357ceda9426c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 991.268507] env[60764]: DEBUG nova.network.neutron [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updated VIF entry in instance network info cache for port ec29421e-046d-4686-bfd0-4f7dc5508d41. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 991.268910] env[60764]: DEBUG nova.network.neutron [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updating instance_info_cache with network_info: [{"id": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "address": "fa:16:3e:6b:10:08", "network": {"id": "bf4a28e2-f8ac-432d-b183-d17a7b28d5d8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-355930240", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec29421e-04", "ovs_interfaceid": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "address": "fa:16:3e:9b:28:6e", "network": {"id": "a30ae683-0278-4e3c-8817-a9abc862306b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1328207510", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0954fad3-d24d-496c-83e6-a09d3cb556fc", "external-id": "nsx-vlan-transportzone-216", "segmentation_id": 216, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b8858fb-5e", "ovs_interfaceid": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "qbh_params": 
null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 991.280911] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Releasing lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 991.281170] env[60764]: DEBUG nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Received event network-vif-plugged-8b8858fb-5e02-43ed-b788-274f827ae3cb {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 991.281358] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Acquiring lock "e80dd396-f709-48d7-bc98-159b175f5593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 991.281585] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Lock "e80dd396-f709-48d7-bc98-159b175f5593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 991.281745] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Lock "e80dd396-f709-48d7-bc98-159b175f5593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 991.281901] env[60764]: DEBUG nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] No waiting events found dispatching network-vif-plugged-8b8858fb-5e02-43ed-b788-274f827ae3cb {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 991.282073] env[60764]: WARNING nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Received unexpected event network-vif-plugged-8b8858fb-5e02-43ed-b788-274f827ae3cb for instance with vm_state building and task_state deleting. [ 991.282227] env[60764]: DEBUG nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Received event network-changed-8b8858fb-5e02-43ed-b788-274f827ae3cb {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 991.282372] env[60764]: DEBUG nova.compute.manager [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Refreshing instance network info cache due to event network-changed-8b8858fb-5e02-43ed-b788-274f827ae3cb. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 991.282549] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Acquiring lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 991.282678] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Acquired lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 991.282825] env[60764]: DEBUG nova.network.neutron [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Refreshing network info cache for port 8b8858fb-5e02-43ed-b788-274f827ae3cb {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 991.646282] env[60764]: DEBUG nova.network.neutron [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updated VIF entry in instance network info cache for port 8b8858fb-5e02-43ed-b788-274f827ae3cb. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 991.646282] env[60764]: DEBUG nova.network.neutron [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updating instance_info_cache with network_info: [{"id": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "address": "fa:16:3e:6b:10:08", "network": {"id": "bf4a28e2-f8ac-432d-b183-d17a7b28d5d8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-355930240", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec29421e-04", "ovs_interfaceid": "ec29421e-046d-4686-bfd0-4f7dc5508d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "address": "fa:16:3e:9b:28:6e", "network": {"id": "a30ae683-0278-4e3c-8817-a9abc862306b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1328207510", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "5e056495f923431badeff3bed318e8cf", "mtu": 8950, "physical_network": 
"default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0954fad3-d24d-496c-83e6-a09d3cb556fc", "external-id": "nsx-vlan-transportzone-216", "segmentation_id": 216, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8b8858fb-5e", "ovs_interfaceid": "8b8858fb-5e02-43ed-b788-274f827ae3cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 991.658696] env[60764]: DEBUG oslo_concurrency.lockutils [req-28cd6739-6f57-4925-b97b-1949240fe0e1 req-d23baaad-817a-4d0d-abc9-a49be78b2733 service nova] Releasing lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 991.708162] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 991.708423] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 991.708638] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 999.381434] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d33e9f4f-2bfa-4c01-be94-c640621d7d8a tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "26c4ebb8-f581-4c06-8000-c80fa09a2d27" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.381696] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d33e9f4f-2bfa-4c01-be94-c640621d7d8a tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "26c4ebb8-f581-4c06-8000-c80fa09a2d27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.190658] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9a71ea8c-1078-4427-9be8-e2bdd9010d95 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "d8e88a52-0f72-4824-9ab5-3ebc8b3509bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.190921] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9a71ea8c-1078-4427-9be8-e2bdd9010d95 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "d8e88a52-0f72-4824-9ab5-3ebc8b3509bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.888588] env[60764]: DEBUG oslo_concurrency.lockutils [None req-492806f5-08d6-463e-bf4d-04d890dd7cdf tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Acquiring lock "75218609-125a-4cb1-90c8-8a508951d9a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.888821] env[60764]: DEBUG oslo_concurrency.lockutils [None req-492806f5-08d6-463e-bf4d-04d890dd7cdf tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Lock "75218609-125a-4cb1-90c8-8a508951d9a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1013.437162] env[60764]: DEBUG oslo_concurrency.lockutils [None req-89f2f3f7-dd74-4003-969e-c689003b239d tempest-ServerTagsTestJSON-125331092 tempest-ServerTagsTestJSON-125331092-project-member] Acquiring lock "c5cfe38e-479e-4823-8ed7-2de3f31f47f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1013.437162] env[60764]: DEBUG oslo_concurrency.lockutils [None req-89f2f3f7-dd74-4003-969e-c689003b239d tempest-ServerTagsTestJSON-125331092 tempest-ServerTagsTestJSON-125331092-project-member] Lock "c5cfe38e-479e-4823-8ed7-2de3f31f47f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1030.707582] env[60764]: WARNING oslo_vmware.rw_handles [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1030.707582] 
env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1030.707582] env[60764]: ERROR oslo_vmware.rw_handles [ 1030.708192] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1030.710842] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1030.711163] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Copying Virtual Disk [datastore2] vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/c093d699-fb0d-4453-8498-d8c94f095702/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1030.711505] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a187968a-fca7-49d3-992c-84a61e9d1a50 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.720065] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Waiting for the task: (returnval){ [ 1030.720065] env[60764]: value = "task-2204940" [ 1030.720065] env[60764]: _type = "Task" [ 1030.720065] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1030.728372] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Task: {'id': task-2204940, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1031.230265] env[60764]: DEBUG oslo_vmware.exceptions [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1031.230545] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1031.231127] env[60764]: ERROR nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1031.231127] env[60764]: Faults: ['InvalidArgument'] [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Traceback (most recent call last): [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] yield resources [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self.driver.spawn(context, instance, image_meta, [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self._fetch_image_if_missing(context, vi) [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] image_cache(vi, tmp_image_ds_loc) [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] vm_util.copy_virtual_disk( [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] session._wait_for_task(vmdk_copy_task) [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] return self.wait_for_task(task_ref) [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] return evt.wait() [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] result = hub.switch() [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] return self.greenlet.switch() [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self.f(*self.args, **self.kw) [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] raise exceptions.translate_fault(task_info.error) [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Faults: ['InvalidArgument'] [ 1031.231127] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] [ 1031.232112] env[60764]: INFO nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Terminating instance [ 1031.233048] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1031.233270] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.233512] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1a95079c-0d95-4fbd-9c91-7c19b11f6915 
{{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.237045] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1031.237045] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1031.238031] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f867a93-b8eb-4df6-a356-b2d40e32f4fd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.244349] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1031.244642] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5d0e763a-040f-411c-90ad-07e7f4cf38be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.246907] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.247092] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1031.248084] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85f60186-83c1-4c59-a308-eaeb5679ed8e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.253147] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Waiting for the task: (returnval){ [ 1031.253147] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521f9502-ca70-2975-46d9-bbf51509dab8" [ 1031.253147] env[60764]: _type = "Task" [ 1031.253147] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1031.264296] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521f9502-ca70-2975-46d9-bbf51509dab8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1031.314945] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1031.315200] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1031.315381] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Deleting the datastore file [datastore2] 7957eb49-d540-4c4e-a86a-1ea3631fb5ef {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1031.315694] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e6508636-5768-47e7-9487-73b7e8461298 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.321804] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Waiting for the task: (returnval){ [ 1031.321804] env[60764]: value = "task-2204942" [ 1031.321804] env[60764]: _type = "Task" [ 1031.321804] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1031.329672] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Task: {'id': task-2204942, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1031.764092] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1031.764399] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Creating directory with path [datastore2] vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.764641] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a2587e72-ac8b-40c6-9bde-936493228f8a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.775713] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Created directory with path [datastore2] vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.775900] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Fetch image to [datastore2] vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1031.776081] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1031.776795] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bcf85af-1b40-4d3f-9210-0836a2ca914d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.784808] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-030c1af3-0612-4a5c-bc13-661232e4e24c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.793730] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cfaa104-b78c-4afe-8032-c6bf340a28f1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.828056] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2ee4e27-0e6c-465f-b002-b767eb2e50f1 
{{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.834553] env[60764]: DEBUG oslo_vmware.api [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Task: {'id': task-2204942, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068542} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1031.836024] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1031.836222] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1031.836485] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1031.836684] env[60764]: INFO nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1031.838750] env[60764]: DEBUG nova.compute.claims [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1031.838918] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1031.839154] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1031.841789] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b1f1aa1a-cb42-4d35-b493-6ed144bf7af5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1031.865282] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1031.925957] env[60764]: DEBUG oslo_vmware.rw_handles [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1031.988716] env[60764]: DEBUG oslo_vmware.rw_handles [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1031.989300] env[60764]: DEBUG oslo_vmware.rw_handles [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1032.232384] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeda1841-8f33-4452-b7c9-806f7ad9c79f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.240547] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a3cee17-92b8-46cb-8d52-528c55fd6e1f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.270455] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-113530f4-a638-47c7-8384-9a47c4a06810 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.277517] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18320427-f7be-4b06-bc31-aa172afe52fa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.290492] env[60764]: DEBUG nova.compute.provider_tree [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1032.298837] env[60764]: DEBUG nova.scheduler.client.report [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1032.313139] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.474s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.313716] env[60764]: ERROR nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1032.313716] env[60764]: Faults: ['InvalidArgument'] [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Traceback (most recent call last): [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1032.313716] 
env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self.driver.spawn(context, instance, image_meta, [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self._fetch_image_if_missing(context, vi) [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] image_cache(vi, tmp_image_ds_loc) [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] vm_util.copy_virtual_disk( [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] session._wait_for_task(vmdk_copy_task) [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] return self.wait_for_task(task_ref) [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] return evt.wait() [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] result = hub.switch() [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] return self.greenlet.switch() [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] self.f(*self.args, **self.kw) [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] raise exceptions.translate_fault(task_info.error) [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Faults: ['InvalidArgument'] [ 1032.313716] env[60764]: ERROR nova.compute.manager [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] [ 1032.314574] env[60764]: DEBUG nova.compute.utils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1032.317037] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Build of instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef was re-scheduled: A specified parameter was not correct: fileType [ 1032.317037] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1032.317578] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1032.317652] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1032.317775] env[60764]: DEBUG nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1032.317935] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1032.330043] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1032.330142] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1032.330218] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1032.353111] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b846d9ae-759a-4898-9ede-091819325701] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.353416] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.353587] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.353772] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.354050] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.354163] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.354288] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.354410] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.354529] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1032.354645] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1032.697878] env[60764]: DEBUG nova.network.neutron [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1032.710411] env[60764]: INFO nova.compute.manager [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Took 0.39 seconds to deallocate network for instance. 
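The traceback above ends in oslo_vmware.exceptions.VimFaultException with Faults: ['InvalidArgument'], which _build_and_run_instance turns into the claim abort and re-schedule logged here. The sketch below shows one way such a fault can be inspected around wait_for_task; the wrapper function and its warning log are illustrative assumptions, not Nova's actual handling.

```python
# Sketch: classifying the fault raised while waiting on a CopyVirtualDisk task.
# Assumes `session` is an authenticated oslo_vmware.api.VMwareAPISession and
# `vmdk_copy_task` is a task reference returned by invoke_api.
import logging

from oslo_vmware import exceptions as vexc

LOG = logging.getLogger(__name__)


def wait_and_classify(session, vmdk_copy_task):
    try:
        session.wait_for_task(vmdk_copy_task)
    except vexc.VimFaultException as exc:
        # fault_list carries the vCenter fault names, e.g. ['InvalidArgument']
        # for the "A specified parameter was not correct: fileType" error above.
        if 'InvalidArgument' in (exc.fault_list or []):
            LOG.warning('CopyVirtualDisk rejected its disk spec: %s', exc)
        raise
```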
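For reference, the inventory reported for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 during the claim abort fixes how much capacity placement will hand out. A quick worked example of that arithmetic, using the values copied from the log and placement's capacity formula (total - reserved) * allocation_ratio, with max_unit capping any single allocation:

```python
# Worked example: capacity implied by the inventory logged above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 176},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:.0f} allocatable, at most {inv['max_unit']} per instance")

# VCPU: 192 allocatable, at most 16 per instance
# MEMORY_MB: 196078 allocatable, at most 65530 per instance
# DISK_GB: 400 allocatable, at most 176 per instance
```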
[ 1032.819869] env[60764]: INFO nova.scheduler.client.report [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Deleted allocations for instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef [ 1032.844024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-223c91b1-9e08-4cea-863d-d103b74ada01 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 479.572s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.848335] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 281.570s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1032.848335] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Acquiring lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1032.848335] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1032.848335] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.848335] env[60764]: INFO nova.compute.manager [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Terminating instance [ 1032.849745] env[60764]: DEBUG nova.compute.manager [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1032.849937] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1032.850558] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f3292894-c03c-460e-b8af-68b29763bc8e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.859804] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bde39d2-dd5b-47bd-9f09-574b89711a27 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.875019] env[60764]: DEBUG nova.compute.manager [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] [instance: b2e4096c-0edc-44ac-a4b6-3a32e0466c54] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1032.892265] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7957eb49-d540-4c4e-a86a-1ea3631fb5ef could not be found. [ 1032.892478] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1032.892654] env[60764]: INFO nova.compute.manager [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1032.892891] env[60764]: DEBUG oslo.service.loopingcall [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1032.893128] env[60764]: DEBUG nova.compute.manager [-] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1032.893228] env[60764]: DEBUG nova.network.neutron [-] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1032.897204] env[60764]: DEBUG nova.compute.manager [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] [instance: b2e4096c-0edc-44ac-a4b6-3a32e0466c54] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1032.919924] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Lock "b2e4096c-0edc-44ac-a4b6-3a32e0466c54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.144s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.929243] env[60764]: DEBUG nova.network.neutron [-] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1032.930954] env[60764]: DEBUG nova.compute.manager [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] [instance: 30aca955-c304-43e8-8da1-91cd7f4e1b38] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1032.937017] env[60764]: INFO nova.compute.manager [-] [instance: 7957eb49-d540-4c4e-a86a-1ea3631fb5ef] Took 0.04 seconds to deallocate network for instance. [ 1032.959426] env[60764]: DEBUG nova.compute.manager [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] [instance: 30aca955-c304-43e8-8da1-91cd7f4e1b38] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1032.986808] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Lock "30aca955-c304-43e8-8da1-91cd7f4e1b38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.158s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.998392] env[60764]: DEBUG nova.compute.manager [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] [instance: c5627bb3-36c8-415f-bf4e-449adedd5ba6] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.023221] env[60764]: DEBUG nova.compute.manager [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] [instance: c5627bb3-36c8-415f-bf4e-449adedd5ba6] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.047726] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e0dfa79d-337e-438b-bfb2-4c621b3cd973 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "7957eb49-d540-4c4e-a86a-1ea3631fb5ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.049329] env[60764]: DEBUG oslo_concurrency.lockutils [None req-6d13e3cf-d904-45ff-b7b3-87e87d8212be tempest-ListServersNegativeTestJSON-153891157 tempest-ListServersNegativeTestJSON-153891157-project-member] Lock "c5627bb3-36c8-415f-bf4e-449adedd5ba6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.190s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.056835] env[60764]: DEBUG nova.compute.manager [None req-2538a5b9-d16f-4988-a8ca-38e30f5cd1e0 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] [instance: 5bd8caaf-f61b-4df2-abd0-3da5259ae829] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.078321] env[60764]: DEBUG nova.compute.manager [None req-2538a5b9-d16f-4988-a8ca-38e30f5cd1e0 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] [instance: 5bd8caaf-f61b-4df2-abd0-3da5259ae829] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.097325] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2538a5b9-d16f-4988-a8ca-38e30f5cd1e0 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] Lock "5bd8caaf-f61b-4df2-abd0-3da5259ae829" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.909s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.108867] env[60764]: DEBUG nova.compute.manager [None req-e6da2882-edb7-438f-bc6b-a2a357fc7855 tempest-ServersNegativeTestJSON-1968257794 tempest-ServersNegativeTestJSON-1968257794-project-member] [instance: 9d4328ec-dab2-41c8-88da-82df6f2ae17f] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.130616] env[60764]: DEBUG nova.compute.manager [None req-e6da2882-edb7-438f-bc6b-a2a357fc7855 tempest-ServersNegativeTestJSON-1968257794 tempest-ServersNegativeTestJSON-1968257794-project-member] [instance: 9d4328ec-dab2-41c8-88da-82df6f2ae17f] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.149919] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e6da2882-edb7-438f-bc6b-a2a357fc7855 tempest-ServersNegativeTestJSON-1968257794 tempest-ServersNegativeTestJSON-1968257794-project-member] Lock "9d4328ec-dab2-41c8-88da-82df6f2ae17f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.099s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.158382] env[60764]: DEBUG nova.compute.manager [None req-1da3d1bd-c45c-457f-bd84-36780e49f390 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 12c9b68c-740c-4555-915d-b23c8b5f0473] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.180944] env[60764]: DEBUG nova.compute.manager [None req-1da3d1bd-c45c-457f-bd84-36780e49f390 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 12c9b68c-740c-4555-915d-b23c8b5f0473] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.200806] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1da3d1bd-c45c-457f-bd84-36780e49f390 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "12c9b68c-740c-4555-915d-b23c8b5f0473" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.019s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.209208] env[60764]: DEBUG nova.compute.manager [None req-34dc3448-b626-4cd5-a127-0c76cccc6a52 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] [instance: d7a99b7b-faa7-4904-9841-0ea582af96ca] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.233353] env[60764]: DEBUG nova.compute.manager [None req-34dc3448-b626-4cd5-a127-0c76cccc6a52 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] [instance: d7a99b7b-faa7-4904-9841-0ea582af96ca] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.253511] env[60764]: DEBUG oslo_concurrency.lockutils [None req-34dc3448-b626-4cd5-a127-0c76cccc6a52 tempest-ServerRescueNegativeTestJSON-251426708 tempest-ServerRescueNegativeTestJSON-251426708-project-member] Lock "d7a99b7b-faa7-4904-9841-0ea582af96ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.184s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.263664] env[60764]: DEBUG nova.compute.manager [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 8cd907ae-697b-4fe3-86d4-e2e9f38ae424] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.285196] env[60764]: DEBUG nova.compute.manager [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 8cd907ae-697b-4fe3-86d4-e2e9f38ae424] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.304654] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "8cd907ae-697b-4fe3-86d4-e2e9f38ae424" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.564s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.316054] env[60764]: DEBUG nova.compute.manager [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.339286] env[60764]: DEBUG nova.compute.manager [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] [instance: 67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.362747] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d6b85a5-02f3-4f91-94d9-606618de7fd0 tempest-MultipleCreateTestJSON-870240710 tempest-MultipleCreateTestJSON-870240710-project-member] Lock "67b9f11f-bb3f-43d6-bed7-54cbb2fd5e82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.595s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.374720] env[60764]: DEBUG nova.compute.manager [None req-0a8e08a0-9b08-4555-a386-495f6c8485d4 tempest-InstanceActionsTestJSON-1306287903 tempest-InstanceActionsTestJSON-1306287903-project-member] [instance: 3b0f00e3-207a-416b-b971-c687df536a85] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.398103] env[60764]: DEBUG nova.compute.manager [None req-0a8e08a0-9b08-4555-a386-495f6c8485d4 tempest-InstanceActionsTestJSON-1306287903 tempest-InstanceActionsTestJSON-1306287903-project-member] [instance: 3b0f00e3-207a-416b-b971-c687df536a85] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.419014] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0a8e08a0-9b08-4555-a386-495f6c8485d4 tempest-InstanceActionsTestJSON-1306287903 tempest-InstanceActionsTestJSON-1306287903-project-member] Lock "3b0f00e3-207a-416b-b971-c687df536a85" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.183s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.428749] env[60764]: DEBUG nova.compute.manager [None req-1096b544-0c1f-421d-a938-00b228fb253e tempest-ServerMetadataNegativeTestJSON-533834650 tempest-ServerMetadataNegativeTestJSON-533834650-project-member] [instance: f926a47b-5252-41a9-9987-027858179887] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.451723] env[60764]: DEBUG nova.compute.manager [None req-1096b544-0c1f-421d-a938-00b228fb253e tempest-ServerMetadataNegativeTestJSON-533834650 tempest-ServerMetadataNegativeTestJSON-533834650-project-member] [instance: f926a47b-5252-41a9-9987-027858179887] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1033.475248] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1096b544-0c1f-421d-a938-00b228fb253e tempest-ServerMetadataNegativeTestJSON-533834650 tempest-ServerMetadataNegativeTestJSON-533834650-project-member] Lock "f926a47b-5252-41a9-9987-027858179887" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.591s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1033.513999] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1033.575039] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1033.575311] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1033.576856] env[60764]: INFO nova.compute.claims [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1033.986402] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2514180-882b-45fb-bfda-4bc557a29890 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1033.994428] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-370ba59f-bef5-4989-b855-2e5d288e3bc0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.024556] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a2f3473-a07c-43da-89b7-ab69c4a070b5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.032327] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6641969d-c74b-4568-a12f-984b10cdfa74 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.046280] env[60764]: DEBUG nova.compute.provider_tree [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1034.055152] env[60764]: DEBUG nova.scheduler.client.report [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1034.072227] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.497s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1034.072717] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1034.103685] env[60764]: DEBUG nova.compute.utils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1034.105215] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1034.105830] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1034.114112] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1034.166582] env[60764]: DEBUG nova.policy [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c85f0e5ebba9460aa95283a6a911a789', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0ae8688b30f34ef8b7b8149cd9ab113b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1034.179971] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1034.204605] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1034.204829] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1034.204982] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1034.205174] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1034.205385] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1034.205506] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1034.205720] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1034.205871] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1034.206045] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1034.206209] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1034.206396] env[60764]: DEBUG nova.virt.hardware [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1034.207264] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99ae52a7-abd1-4bc2-92e3-fda88f0b2fec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.214920] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-026004a3-a48b-4419-bc1a-52d66b96b5b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.329955] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1034.343634] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1034.343848] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1034.344016] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1034.344180] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1034.345253] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-579f47de-94a9-4234-83ee-0da624d187ce {{(pid=60764) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.353863] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b4d009a-aa2a-416f-a687-4305709aa3de {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.367343] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-298402bf-c96e-495e-940e-b45c6a5a7005 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.373512] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7627582b-a208-4ec6-9691-1794889c4ca2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1034.403164] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181270MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1034.403323] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1034.403514] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1034.474434] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Successfully created port: 3bbd8469-e08b-4049-9a9e-27f819f552e2 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b846d9ae-759a-4898-9ede-091819325701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.491728] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.492207] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1034.512778] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9c01648d-4b7d-460a-b73d-e7324cf251e8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.525669] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.545024] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.561393] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 594e3624-e282-4695-a6a7-88ab1e2ddfff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.574257] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ba8d2109-e600-4992-b997-c998ae288b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.587502] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d8838301-49a7-4291-8091-6fc90fabc7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.604571] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 69874f31-5316-4a8e-be6c-f77ac9f6ffbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.616456] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 919607ac-6116-49a7-a575-aff30f9e4c86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.628811] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c0c847ad-0c16-4796-a28c-9efdd19b7096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.640053] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 33f526a8-730d-4264-b24a-1dd892343a15 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.651207] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f7cbb5ac-1fcf-457a-adea-bce8e9765699 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.663230] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74709301-6eae-40c1-b987-4be9262ef7ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.682158] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.695455] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d8e88a52-0f72-4824-9ab5-3ebc8b3509bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.708806] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 75218609-125a-4cb1-90c8-8a508951d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.720175] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c5cfe38e-479e-4823-8ed7-2de3f31f47f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1034.720382] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1034.720426] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1035.056677] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ec6a716-18f6-449c-ae03-b73eef1581a2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.064284] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59d09bc1-a2b1-4334-8757-1591e13f0553 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.095768] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9afec2c-708f-404f-a215-a138c8af3895 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.103785] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f72b40f5-6fc6-45b0-9a29-4d5ced58045d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.118084] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1035.129012] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1035.143552] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1035.143710] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.740s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1035.226138] env[60764]: DEBUG nova.compute.manager [req-d72430c7-7fa5-4ac8-b3bf-522280e3c31f req-1aa932c2-e082-42a7-9c9d-99ca38ee77aa service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Received event network-vif-plugged-3bbd8469-e08b-4049-9a9e-27f819f552e2 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1035.226138] env[60764]: DEBUG oslo_concurrency.lockutils [req-d72430c7-7fa5-4ac8-b3bf-522280e3c31f req-1aa932c2-e082-42a7-9c9d-99ca38ee77aa service nova] Acquiring lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1035.226138] env[60764]: DEBUG oslo_concurrency.lockutils [req-d72430c7-7fa5-4ac8-b3bf-522280e3c31f req-1aa932c2-e082-42a7-9c9d-99ca38ee77aa service nova] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1035.226138] env[60764]: DEBUG oslo_concurrency.lockutils [req-d72430c7-7fa5-4ac8-b3bf-522280e3c31f req-1aa932c2-e082-42a7-9c9d-99ca38ee77aa service nova] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1035.226138] env[60764]: DEBUG nova.compute.manager [req-d72430c7-7fa5-4ac8-b3bf-522280e3c31f req-1aa932c2-e082-42a7-9c9d-99ca38ee77aa service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] No waiting events found dispatching network-vif-plugged-3bbd8469-e08b-4049-9a9e-27f819f552e2 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1035.226138] env[60764]: WARNING nova.compute.manager [req-d72430c7-7fa5-4ac8-b3bf-522280e3c31f req-1aa932c2-e082-42a7-9c9d-99ca38ee77aa service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Received unexpected event network-vif-plugged-3bbd8469-e08b-4049-9a9e-27f819f552e2 for instance with vm_state building and task_state spawning. 
[ 1035.301715] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Successfully updated port: 3bbd8469-e08b-4049-9a9e-27f819f552e2 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1035.315890] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "refresh_cache-ce8f8161-623c-4f88-8846-8f3b5a4ceabe" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1035.316049] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquired lock "refresh_cache-ce8f8161-623c-4f88-8846-8f3b5a4ceabe" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1035.316199] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1035.352405] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1035.520084] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Updating instance_info_cache with network_info: [{"id": "3bbd8469-e08b-4049-9a9e-27f819f552e2", "address": "fa:16:3e:bb:0f:83", "network": {"id": "99317aa1-57d1-4a75-b1d6-c09ab320506e", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-29667545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0ae8688b30f34ef8b7b8149cd9ab113b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3bbd8469-e0", "ovs_interfaceid": "3bbd8469-e08b-4049-9a9e-27f819f552e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.530741] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Releasing lock "refresh_cache-ce8f8161-623c-4f88-8846-8f3b5a4ceabe" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1035.531047] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance network_info: |[{"id": "3bbd8469-e08b-4049-9a9e-27f819f552e2", "address": "fa:16:3e:bb:0f:83", "network": {"id": "99317aa1-57d1-4a75-b1d6-c09ab320506e", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-29667545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0ae8688b30f34ef8b7b8149cd9ab113b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3bbd8469-e0", "ovs_interfaceid": "3bbd8469-e08b-4049-9a9e-27f819f552e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1035.531437] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bb:0f:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3bbd8469-e08b-4049-9a9e-27f819f552e2', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1035.539137] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Creating folder: Project (0ae8688b30f34ef8b7b8149cd9ab113b). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.540056] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c0dba8d9-bff7-45a4-9716-9a3c0fcb2b8c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.550814] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Created folder: Project (0ae8688b30f34ef8b7b8149cd9ab113b) in parent group-v449629. [ 1035.550990] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Creating folder: Instances. Parent ref: group-v449697. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.551222] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-743dfcd6-cf14-46cb-935a-d0d7e54b9ad8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.560536] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Created folder: Instances in parent group-v449697. [ 1035.560720] env[60764]: DEBUG oslo.service.loopingcall [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1035.560889] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1035.561086] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2f654a2b-e25f-4082-905c-0fadfffeb00f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.579868] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1035.579868] env[60764]: value = "task-2204945" [ 1035.579868] env[60764]: _type = "Task" [ 1035.579868] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1035.586766] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204945, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1036.090106] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204945, 'name': CreateVM_Task, 'duration_secs': 0.30595} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1036.090362] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1036.090991] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1036.091170] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1036.091490] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1036.091738] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e56fcfc2-90a2-4ebb-bd96-bc896f086ea6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1036.096256] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Waiting for the task: (returnval){ [ 1036.096256] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52da82b4-62f9-8ee3-22ab-8db39b3a116f" [ 1036.096256] env[60764]: _type = "Task" [ 
1036.096256] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1036.104992] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52da82b4-62f9-8ee3-22ab-8db39b3a116f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1036.605988] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1036.606255] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1036.606464] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1037.144604] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.144930] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.254157] env[60764]: DEBUG nova.compute.manager [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Received event network-changed-3bbd8469-e08b-4049-9a9e-27f819f552e2 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1037.254157] env[60764]: DEBUG nova.compute.manager [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Refreshing instance network info cache due to event network-changed-3bbd8469-e08b-4049-9a9e-27f819f552e2. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1037.254157] env[60764]: DEBUG oslo_concurrency.lockutils [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] Acquiring lock "refresh_cache-ce8f8161-623c-4f88-8846-8f3b5a4ceabe" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1037.254157] env[60764]: DEBUG oslo_concurrency.lockutils [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] Acquired lock "refresh_cache-ce8f8161-623c-4f88-8846-8f3b5a4ceabe" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1037.254300] env[60764]: DEBUG nova.network.neutron [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Refreshing network info cache for port 3bbd8469-e08b-4049-9a9e-27f819f552e2 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1037.324699] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.329292] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.329805] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.567050] env[60764]: DEBUG nova.network.neutron [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Updated VIF entry in instance network info cache for port 3bbd8469-e08b-4049-9a9e-27f819f552e2. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1037.567050] env[60764]: DEBUG nova.network.neutron [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Updating instance_info_cache with network_info: [{"id": "3bbd8469-e08b-4049-9a9e-27f819f552e2", "address": "fa:16:3e:bb:0f:83", "network": {"id": "99317aa1-57d1-4a75-b1d6-c09ab320506e", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-29667545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0ae8688b30f34ef8b7b8149cd9ab113b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3bbd8469-e0", "ovs_interfaceid": "3bbd8469-e08b-4049-9a9e-27f819f552e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1037.575608] env[60764]: DEBUG oslo_concurrency.lockutils [req-06a13a49-443e-47a0-a7fe-2d4fc6afc54a req-b2f2a9f4-b79c-40be-91eb-cdea93d40603 service nova] Releasing lock "refresh_cache-ce8f8161-623c-4f88-8846-8f3b5a4ceabe" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1038.329420] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1041.331804] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1041.332185] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1042.703952] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1079.031519] env[60764]: WARNING oslo_vmware.rw_handles [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1079.031519] env[60764]: ERROR oslo_vmware.rw_handles [ 1079.032178] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1079.034677] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1079.034974] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Copying Virtual Disk [datastore2] vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/76557603-34f9-41f3-8463-c54d466eac02/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1079.035309] env[60764]: 
DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3decfaef-58f8-4967-ba9c-5e491adf4706 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.044378] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Waiting for the task: (returnval){ [ 1079.044378] env[60764]: value = "task-2204946" [ 1079.044378] env[60764]: _type = "Task" [ 1079.044378] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.052466] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Task: {'id': task-2204946, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.554838] env[60764]: DEBUG oslo_vmware.exceptions [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1079.555083] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1079.556142] env[60764]: ERROR nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1079.556142] env[60764]: Faults: ['InvalidArgument'] [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] Traceback (most recent call last): [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] yield resources [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self.driver.spawn(context, instance, image_meta, [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self._fetch_image_if_missing(context, vi) [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] image_cache(vi, tmp_image_ds_loc) [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] vm_util.copy_virtual_disk( [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] session._wait_for_task(vmdk_copy_task) [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] return self.wait_for_task(task_ref) [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] return evt.wait() [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] result = hub.switch() [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] return self.greenlet.switch() [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self.f(*self.args, **self.kw) [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] raise exceptions.translate_fault(task_info.error) [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1079.556142] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] Faults: ['InvalidArgument'] [ 1079.556142] 
env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] [ 1079.556142] env[60764]: INFO nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Terminating instance [ 1079.558493] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1079.558493] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.558493] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2d7dac36-35e7-460d-bf98-b10d1d30e921 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.560547] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1079.560737] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1079.561481] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b773a1e4-95ac-4cd6-a10e-cb500ac1f101 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.568587] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1079.568885] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b716d0e8-a524-434f-b409-dd47cc7173da {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.572367] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.572367] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 
tempest-MigrationsAdminTest-1153113698-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1079.573029] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e600af37-17fc-4484-a744-f644924a6924 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.577840] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Waiting for the task: (returnval){ [ 1079.577840] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52785d76-ad1e-a5dd-7334-7ca4ccb9fe5b" [ 1079.577840] env[60764]: _type = "Task" [ 1079.577840] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.586728] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52785d76-ad1e-a5dd-7334-7ca4ccb9fe5b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.644610] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1079.644764] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1079.644954] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Deleting the datastore file [datastore2] b846d9ae-759a-4898-9ede-091819325701 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1079.645238] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-05dfad95-e286-4e03-b56f-150c49141bf7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.651853] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Waiting for the task: (returnval){ [ 1079.651853] env[60764]: value = "task-2204948" [ 1079.651853] env[60764]: _type = "Task" [ 1079.651853] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.660200] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Task: {'id': task-2204948, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1080.088143] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1080.088421] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Creating directory with path [datastore2] vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1080.088662] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bb1f4f89-9023-4ab7-9cad-28207c96808f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.100547] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Created directory with path [datastore2] vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1080.100772] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Fetch image to [datastore2] vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1080.100964] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1080.101857] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0937f2d8-54f7-4f20-9bc1-f726dbf2bd15 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.108657] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b83ef6b0-4cd3-4017-815a-d5028820b6a7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.118101] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b47b7058-5f3e-4eb0-ac17-257c6b353064 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.149816] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ac09bbe-bca5-464c-b6f5-560189e5dfd5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.162091] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-47e1d09a-a901-4f49-9a81-5efe5fc1ef2a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.163948] env[60764]: DEBUG oslo_vmware.api [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Task: {'id': task-2204948, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07588} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1080.164218] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1080.164401] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1080.164625] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1080.164729] env[60764]: INFO nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1080.167318] env[60764]: DEBUG nova.compute.claims [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1080.167487] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.167727] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1080.186084] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1080.264072] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1080.326123] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1080.326123] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1080.652782] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3385d0d4-c578-4c97-a9e0-9e64d70627c7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.660849] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b12f8129-b28f-40a8-b7e6-e230e3791956 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.693855] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f009bbd4-a4aa-405a-b955-4a522ce285ac {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.700019] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c406ed-dffe-42c3-aad7-cdf3c0306ba7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.713017] env[60764]: DEBUG nova.compute.provider_tree [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1080.727993] env[60764]: DEBUG nova.scheduler.client.report [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1080.742822] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.575s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1080.743420] env[60764]: ERROR nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1080.743420] env[60764]: Faults: ['InvalidArgument'] [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] Traceback (most recent call last): [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1080.743420] env[60764]: ERROR 
nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self.driver.spawn(context, instance, image_meta, [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self._fetch_image_if_missing(context, vi) [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] image_cache(vi, tmp_image_ds_loc) [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] vm_util.copy_virtual_disk( [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] session._wait_for_task(vmdk_copy_task) [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] return self.wait_for_task(task_ref) [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] return evt.wait() [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] result = hub.switch() [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] return self.greenlet.switch() [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] self.f(*self.args, **self.kw) [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] raise exceptions.translate_fault(task_info.error) [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] Faults: ['InvalidArgument'] [ 1080.743420] env[60764]: ERROR nova.compute.manager [instance: b846d9ae-759a-4898-9ede-091819325701] [ 1080.744348] env[60764]: DEBUG nova.compute.utils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1080.745742] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Build of instance b846d9ae-759a-4898-9ede-091819325701 was re-scheduled: A specified parameter was not correct: fileType [ 1080.745742] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1080.746169] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1080.746308] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1080.746558] env[60764]: DEBUG nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1080.746755] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1081.148873] env[60764]: DEBUG nova.network.neutron [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1081.160122] env[60764]: INFO nova.compute.manager [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Took 0.41 seconds to deallocate network for instance. [ 1081.273040] env[60764]: INFO nova.scheduler.client.report [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Deleted allocations for instance b846d9ae-759a-4898-9ede-091819325701 [ 1081.307105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-7b4a37b6-0091-4ca6-bc0b-ce955cf24b9e tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "b846d9ae-759a-4898-9ede-091819325701" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 523.501s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.307105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "b846d9ae-759a-4898-9ede-091819325701" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 324.709s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.307105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Acquiring lock "b846d9ae-759a-4898-9ede-091819325701-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1081.307105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "b846d9ae-759a-4898-9ede-091819325701-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.307847] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "b846d9ae-759a-4898-9ede-091819325701-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.311257] env[60764]: INFO nova.compute.manager [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Terminating instance [ 1081.313818] env[60764]: DEBUG nova.compute.manager [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1081.314171] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1081.316118] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-55ac986c-5522-4b73-bbde-c045acb7725a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.327075] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-551e13d6-69f8-40fe-be99-197b5b249e35 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.341765] env[60764]: DEBUG nova.compute.manager [None req-e018a361-fc91-49e3-962e-84d00cf37b33 tempest-ServerAddressesNegativeTestJSON-2136288688 tempest-ServerAddressesNegativeTestJSON-2136288688-project-member] [instance: 9c01648d-4b7d-460a-b73d-e7324cf251e8] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1081.364836] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b846d9ae-759a-4898-9ede-091819325701 could not be found. [ 1081.365208] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1081.365437] env[60764]: INFO nova.compute.manager [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] [instance: b846d9ae-759a-4898-9ede-091819325701] Took 0.05 seconds to destroy the instance on the hypervisor. 
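The repeated "Acquiring lock ... by ..." / "acquired ... waited" / '"released" ... held' triplets in these records come from oslo.concurrency's named locks, used both as a decorator and as a context manager. A minimal sketch of both forms, assuming nothing beyond oslo.concurrency itself; the lock names mirror the log, and the function bodies are placeholders:

```python
# Sketch of the locking pattern behind the lockutils.py:404/409/423 records above.
from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def abort_instance_claim():
    # Decorator form: lockutils logs "Acquiring lock ...", how long the caller
    # waited, and how long the lock was held once the wrapped call returns.
    pass


def refresh_network_cache(instance_uuid):
    # Context-manager form, matching the "refresh_cache-<uuid>" acquire/release
    # records emitted while the instance network info cache is rebuilt.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass
```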
[ 1081.365723] env[60764]: DEBUG oslo.service.loopingcall [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1081.365998] env[60764]: DEBUG nova.compute.manager [-] [instance: b846d9ae-759a-4898-9ede-091819325701] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1081.366111] env[60764]: DEBUG nova.network.neutron [-] [instance: b846d9ae-759a-4898-9ede-091819325701] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1081.369678] env[60764]: DEBUG nova.compute.manager [None req-e018a361-fc91-49e3-962e-84d00cf37b33 tempest-ServerAddressesNegativeTestJSON-2136288688 tempest-ServerAddressesNegativeTestJSON-2136288688-project-member] [instance: 9c01648d-4b7d-460a-b73d-e7324cf251e8] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1081.408435] env[60764]: DEBUG nova.network.neutron [-] [instance: b846d9ae-759a-4898-9ede-091819325701] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1081.413452] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e018a361-fc91-49e3-962e-84d00cf37b33 tempest-ServerAddressesNegativeTestJSON-2136288688 tempest-ServerAddressesNegativeTestJSON-2136288688-project-member] Lock "9c01648d-4b7d-460a-b73d-e7324cf251e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.731s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.418682] env[60764]: INFO nova.compute.manager [-] [instance: b846d9ae-759a-4898-9ede-091819325701] Took 0.05 seconds to deallocate network for instance. [ 1081.423605] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1081.495024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1081.495024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.495679] env[60764]: INFO nova.compute.claims [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1081.560832] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d09146f6-8779-4d1d-b5f2-374de8b337e7 tempest-ServerPasswordTestJSON-382129813 tempest-ServerPasswordTestJSON-382129813-project-member] Lock "b846d9ae-759a-4898-9ede-091819325701" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.254s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.910974] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93bc624d-5a3d-498a-9184-09c4fffa30c7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.924925] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b3bf225-2da0-485f-b5d3-a6b84afd34d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.960350] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08288e13-623d-4574-9736-11e9905acc79 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.968051] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87afa2d4-ae65-43e8-ac61-1b40ea591152 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.983094] env[60764]: DEBUG nova.compute.provider_tree [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1081.995395] env[60764]: DEBUG nova.scheduler.client.report [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': 
{'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1082.010411] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.517s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1082.012286] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1082.050379] env[60764]: DEBUG nova.compute.utils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1082.051217] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1082.053666] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1082.062932] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1082.129702] env[60764]: DEBUG nova.policy [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1f32cefba46c4c1699d69409b4eb6147', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2a4e6f7c3621435f897f8009f1693251', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1082.135897] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1082.163916] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1082.163916] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1082.163916] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1082.164315] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1082.164315] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1082.164415] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1082.164628] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1082.164782] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1082.164948] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1082.165117] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1082.165285] env[60764]: DEBUG nova.virt.hardware [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1082.166166] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5884fbe1-4f38-49ad-b160-e818f0376aad {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1082.174686] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e46f25dc-f81a-4560-bbe6-c702c16d7d55 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1082.628599] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Successfully created port: 71dfa984-60dd-4c44-b1d4-9639276d87e8 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1083.096916] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "6397ff19-1385-4e38-b199-666394582582" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
{{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1083.097270] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "6397ff19-1385-4e38-b199-666394582582" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1083.343237] env[60764]: DEBUG nova.compute.manager [req-e19e0e0d-d806-4762-b078-ac54bb38f941 req-0101d2c1-eb44-440f-9d02-642cf3857401 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Received event network-vif-plugged-71dfa984-60dd-4c44-b1d4-9639276d87e8 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1083.343519] env[60764]: DEBUG oslo_concurrency.lockutils [req-e19e0e0d-d806-4762-b078-ac54bb38f941 req-0101d2c1-eb44-440f-9d02-642cf3857401 service nova] Acquiring lock "aad42e7f-24c2-400e-8a1c-6baae2081e29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1083.343647] env[60764]: DEBUG oslo_concurrency.lockutils [req-e19e0e0d-d806-4762-b078-ac54bb38f941 req-0101d2c1-eb44-440f-9d02-642cf3857401 service nova] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1083.343867] env[60764]: DEBUG oslo_concurrency.lockutils [req-e19e0e0d-d806-4762-b078-ac54bb38f941 req-0101d2c1-eb44-440f-9d02-642cf3857401 service nova] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1083.344351] env[60764]: DEBUG nova.compute.manager [req-e19e0e0d-d806-4762-b078-ac54bb38f941 req-0101d2c1-eb44-440f-9d02-642cf3857401 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] No waiting events found dispatching network-vif-plugged-71dfa984-60dd-4c44-b1d4-9639276d87e8 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1083.344575] env[60764]: WARNING nova.compute.manager [req-e19e0e0d-d806-4762-b078-ac54bb38f941 req-0101d2c1-eb44-440f-9d02-642cf3857401 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Received unexpected event network-vif-plugged-71dfa984-60dd-4c44-b1d4-9639276d87e8 for instance with vm_state building and task_state spawning. 
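The inventory reported for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 a few records back is enough to work out the capacity Placement schedules against: for each resource class the usable capacity is (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A small worked sketch using the numbers from the log; the helper function is illustrative, not Placement's code:

```python
# Capacity implied by the reported inventory: (total - reserved) * allocation_ratio.
def capacity(total, reserved, allocation_ratio, **_ignored):
    return int((total - reserved) * allocation_ratio)


inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'max_unit': 16, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530,
                  'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'max_unit': 176,
                'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    print(rc, capacity(**inv))
# -> VCPU 192, MEMORY_MB 196078, DISK_GB 400.  Each m1.nano instance in this run
#    claims {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1} (see the resource tracker
#    records later in the log), so these claims fit easily; max_unit only limits
#    how much a single instance may request.
```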
[ 1083.368402] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1083.418116] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Successfully updated port: 71dfa984-60dd-4c44-b1d4-9639276d87e8 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1083.433855] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1083.433855] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1083.433855] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1083.485510] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1083.698535] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Updating instance_info_cache with network_info: [{"id": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "address": "fa:16:3e:40:19:68", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap71dfa984-60", "ovs_interfaceid": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1083.711234] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1083.711534] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance network_info: |[{"id": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "address": "fa:16:3e:40:19:68", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap71dfa984-60", "ovs_interfaceid": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1083.711930] env[60764]: 
DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:19:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f87a752-ebb0-49a4-a67b-e356fa45b89b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '71dfa984-60dd-4c44-b1d4-9639276d87e8', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1083.719942] env[60764]: DEBUG oslo.service.loopingcall [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1083.720439] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1083.720666] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-96944a65-2785-44c8-9ceb-b0f0890283d4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1083.743021] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1083.743021] env[60764]: value = "task-2204949" [ 1083.743021] env[60764]: _type = "Task" [ 1083.743021] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1083.748915] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204949, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1084.250986] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204949, 'name': CreateVM_Task, 'duration_secs': 0.287659} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1084.251189] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1084.251748] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1084.251908] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1084.252240] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1084.252476] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-00e5b79d-2d20-4603-be61-ff4ae47d147a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1084.257590] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 1084.257590] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5276a19f-9128-4267-f4a2-24b286cd69c1" [ 1084.257590] env[60764]: _type = "Task" [ 1084.257590] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1084.265551] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5276a19f-9128-4267-f4a2-24b286cd69c1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1084.768997] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1084.769274] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1084.769513] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1085.388647] env[60764]: DEBUG nova.compute.manager [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Received event network-changed-71dfa984-60dd-4c44-b1d4-9639276d87e8 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1085.388852] env[60764]: DEBUG nova.compute.manager [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Refreshing instance network info cache due to event network-changed-71dfa984-60dd-4c44-b1d4-9639276d87e8. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1085.389077] env[60764]: DEBUG oslo_concurrency.lockutils [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] Acquiring lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1085.389221] env[60764]: DEBUG oslo_concurrency.lockutils [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] Acquired lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1085.389382] env[60764]: DEBUG nova.network.neutron [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Refreshing network info cache for port 71dfa984-60dd-4c44-b1d4-9639276d87e8 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1085.766944] env[60764]: DEBUG nova.network.neutron [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Updated VIF entry in instance network info cache for port 71dfa984-60dd-4c44-b1d4-9639276d87e8. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1085.767305] env[60764]: DEBUG nova.network.neutron [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Updating instance_info_cache with network_info: [{"id": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "address": "fa:16:3e:40:19:68", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap71dfa984-60", "ovs_interfaceid": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1085.775948] env[60764]: DEBUG oslo_concurrency.lockutils [req-b1418a34-b10b-4ba4-8755-8c8eb71b63e3 req-80bcaaa9-e690-48d5-ae75-7703c5e4c760 service nova] Releasing lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1089.330659] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1089.330903] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances with incomplete migration {{(pid=60764) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1089.824170] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "a3199f59-f827-404e-8272-296129096180" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1089.824536] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a3199f59-f827-404e-8272-296129096180" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1093.330816] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task 
ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1094.337736] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1094.338072] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1094.338072] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1094.359279] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.359518] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.359619] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.359768] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.359941] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.360016] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.360192] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.360312] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.360429] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.360547] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1094.360658] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1095.329509] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1095.341560] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1095.341842] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1095.341935] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1095.342099] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1095.343339] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faafb6ae-8882-46fd-8a87-8b2542d2dde3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.353338] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d19c66ec-9139-4bd5-bb7c-7a2809dc83f4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.366874] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9c60b08-2437-45aa-9573-025ca4ae8655 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.373014] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d9f26ea-f1f5-4e05-bff2-994dbbe71652 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.401947] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181217MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1095.402132] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1095.402325] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1095.548462] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.548462] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.548462] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.548696] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.548696] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.548803] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.548921] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.549053] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.549174] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.549289] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1095.560447] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.570643] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 594e3624-e282-4695-a6a7-88ab1e2ddfff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.580130] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ba8d2109-e600-4992-b997-c998ae288b59 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.590679] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d8838301-49a7-4291-8091-6fc90fabc7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.599066] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 69874f31-5316-4a8e-be6c-f77ac9f6ffbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.607962] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 919607ac-6116-49a7-a575-aff30f9e4c86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.618319] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c0c847ad-0c16-4796-a28c-9efdd19b7096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.627846] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 33f526a8-730d-4264-b24a-1dd892343a15 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.636673] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f7cbb5ac-1fcf-457a-adea-bce8e9765699 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.645617] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74709301-6eae-40c1-b987-4be9262ef7ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.654019] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.664924] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d8e88a52-0f72-4824-9ab5-3ebc8b3509bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.673181] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 75218609-125a-4cb1-90c8-8a508951d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.682061] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c5cfe38e-479e-4823-8ed7-2de3f31f47f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.691637] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.700613] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1095.700840] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1095.700983] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1095.716912] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing inventories for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1095.731860] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating ProviderTree inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1095.732078] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1095.742982] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing aggregate associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, aggregates: None {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1095.761946] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing trait associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1096.042327] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7247c43-64e1-4510-b39d-352e0a41ef57 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.050027] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-715eb090-2c7f-40b2-a3dd-b316eb1d7a75 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.079532] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-955ecd4e-8a5d-44c6-a98d-7dea2c3965ba {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.086889] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d86769-d804-4628-9940-5a431528f83c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.101083] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1096.109738] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1096.131470] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1096.131756] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.729s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1096.330636] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1096.330887] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1096.354041] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Getting list of instances from cluster (obj){ [ 1096.354041] env[60764]: value = "domain-c8" [ 1096.354041] env[60764]: _type = "ClusterComputeResource" [ 1096.354041] env[60764]: } {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1096.355456] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d7461b2-deee-4c97-a88b-29c8fd047632 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.373482] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Got total of 10 instances {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1096.373662] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.373856] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 437e0c0d-6d0e-4465-9651-14e420b646ae {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.374037] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 1f11c625-166f-4609-badf-da4dd9475c37 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.374242] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 6af62a04-4a13-46c7-a0b2-28768c789f23 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.374435] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 7a843233-c56c-4d87-aeb0-2ffaa441b021 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.374630] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 74b4bba7-8568-4fc4-a744-395a3271abc8 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.374778] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid dfd3e3af-90c9-420b-81ec-e9115c519016 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.374961] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid e80dd396-f709-48d7-bc98-159b175f5593 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.375082] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid ce8f8161-623c-4f88-8846-8f3b5a4ceabe {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.375228] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid aad42e7f-24c2-400e-8a1c-6baae2081e29 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1096.375549] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.375778] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "437e0c0d-6d0e-4465-9651-14e420b646ae" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.375973] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "1f11c625-166f-4609-badf-da4dd9475c37" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.376186] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "6af62a04-4a13-46c7-a0b2-28768c789f23" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.376430] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.376633] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "74b4bba7-8568-4fc4-a744-395a3271abc8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.376823] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "dfd3e3af-90c9-420b-81ec-e9115c519016" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.377020] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "e80dd396-f709-48d7-bc98-159b175f5593" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.377212] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.377397] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1096.377620] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1096.377751] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted 
instances {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1096.392136] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] There are 1 instances to clean {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1096.392399] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c2ee6cc7-f9df-43d4-b3f8-455fcf431e0e] Instance has had 0 of 5 cleanup attempts {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11211}} [ 1097.423492] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1097.446376] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1098.330510] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1098.330828] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1098.331043] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1100.330133] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1101.330609] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1101.330876] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1126.911123] env[60764]: WARNING oslo_vmware.rw_handles [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1126.911123] env[60764]: ERROR oslo_vmware.rw_handles [ 1126.911783] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1126.913820] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1126.914210] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Copying Virtual Disk [datastore2] vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/36a57a82-2c02-4303-aeb6-26a53af82c95/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1126.914871] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b46a05bb-8dbb-4b3b-991c-8dce755a074c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1126.922392] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Waiting for the task: (returnval){ [ 
1126.922392] env[60764]: value = "task-2204950" [ 1126.922392] env[60764]: _type = "Task" [ 1126.922392] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1126.930525] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Task: {'id': task-2204950, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1127.432390] env[60764]: DEBUG oslo_vmware.exceptions [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1127.432684] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1127.433251] env[60764]: ERROR nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1127.433251] env[60764]: Faults: ['InvalidArgument'] [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Traceback (most recent call last): [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] yield resources [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self.driver.spawn(context, instance, image_meta, [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self._fetch_image_if_missing(context, vi) [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] image_cache(vi, 
tmp_image_ds_loc) [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] vm_util.copy_virtual_disk( [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] session._wait_for_task(vmdk_copy_task) [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] return self.wait_for_task(task_ref) [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] return evt.wait() [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] result = hub.switch() [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] return self.greenlet.switch() [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self.f(*self.args, **self.kw) [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] raise exceptions.translate_fault(task_info.error) [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Faults: ['InvalidArgument'] [ 1127.433251] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] [ 1127.434391] env[60764]: INFO nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Terminating instance [ 1127.435118] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 
tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1127.435326] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1127.435566] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b650af7c-ef68-4ed5-b0f6-6286b5777273 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.437935] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1127.438141] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1127.438867] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30acca3-6dd1-4920-894c-92976370e5df {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.445502] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1127.445709] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d0ce993b-98e3-459c-8965-843e828b410f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.447765] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1127.447931] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1127.448870] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-749967c5-9548-418a-b289-92c250500552 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.453762] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Waiting for the task: (returnval){ [ 1127.453762] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521c089e-5225-9f5d-9532-93e1f97113ce" [ 1127.453762] env[60764]: _type = "Task" [ 1127.453762] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1127.465631] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521c089e-5225-9f5d-9532-93e1f97113ce, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1127.511258] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1127.511474] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1127.511657] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Deleting the datastore file [datastore2] 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1127.511923] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-71d91aeb-d475-4fb8-9933-b4642b6a596d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.518635] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Waiting for the task: (returnval){ [ 1127.518635] env[60764]: value = "task-2204952" [ 1127.518635] env[60764]: _type = "Task" [ 1127.518635] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1127.526262] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Task: {'id': task-2204952, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1127.966315] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1127.966627] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Creating directory with path [datastore2] vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1127.966627] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-90cee499-28a2-4cda-9e01-a79b528a0554 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.977787] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Created directory with path [datastore2] vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1127.978010] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Fetch image to [datastore2] vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1127.978232] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1127.978979] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8535c023-40c4-4d42-aa14-ceef6d5346ee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.985951] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc565cbb-8555-4b44-a7a4-baf9e4483a7a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.995126] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab4ee53e-1d7d-4008-bf69-b66fc6fc0a61 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1128.028808] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc1655da-fab1-4115-af5d-3746c39d11b9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.036142] env[60764]: DEBUG oslo_vmware.api [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Task: {'id': task-2204952, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082263} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1128.037652] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1128.037876] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1128.038079] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1128.038258] env[60764]: INFO nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Took 0.60 seconds to destroy the instance on the hypervisor. 
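
Editor's note: the entries above ("Waiting for the task ... progress is 0% ... completed successfully", and the earlier CopyVirtualDisk_Task failure that surfaced as VimFaultException) come from oslo.vmware's task-polling path (wait_for_task / _poll_task in oslo_vmware/api.py, driven by oslo_vmware.common.loopingcall per the traceback, with faults converted by exceptions.translate_fault). The following is a minimal, self-contained sketch of that poll-and-translate pattern, not oslo.vmware's actual implementation; TaskInfo, TaskFault and fetch_task_info are hypothetical stand-ins for the real Task managed-object read.

# Minimal sketch of a vCenter-style task poll loop (illustrative only).
import time
from dataclasses import dataclass

@dataclass
class TaskInfo:
    state: str               # 'queued' | 'running' | 'success' | 'error'
    progress: int = 0
    error: str | None = None

class TaskFault(Exception):
    """Stand-in for a translated VMware fault (VimFaultException)."""

def wait_for_task(fetch_task_info, task_ref, poll_interval=0.5):
    """Poll task state until success; raise a translated fault on error."""
    while True:
        info = fetch_task_info(task_ref)
        if info.state == 'success':
            return info
        if info.state == 'error':
            # This is where oslo.vmware translates the server-side fault
            # (e.g. InvalidArgument: fileType) into a Python exception.
            raise TaskFault(info.error)
        time.sleep(poll_interval)

if __name__ == '__main__':
    # Toy usage: a fake task that reports progress twice, then succeeds.
    polls = iter([TaskInfo('running', 0), TaskInfo('running', 50),
                  TaskInfo('success', 100)])
    print(wait_for_task(lambda ref: next(polls), 'task-2204950', poll_interval=0))

In the real driver this loop runs inside an eventlet-based looping call, which is why the failure propagates through hub.switch() in the tracebacks above.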
[ 1128.040063] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f5f1155e-df56-4ac5-b90c-a1716950f14b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.041983] env[60764]: DEBUG nova.compute.claims [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1128.042175] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1128.042389] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1128.068338] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1128.134032] env[60764]: DEBUG oslo_vmware.rw_handles [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1128.194248] env[60764]: DEBUG oslo_vmware.rw_handles [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1128.194434] env[60764]: DEBUG oslo_vmware.rw_handles [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1128.508981] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50be2054-61e4-4c62-acbb-00d1c5c0158a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.516484] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a06b68d1-694a-4bd0-aee3-800cd05b8ba4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.547028] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-646f805f-638a-4d4e-852c-3368d029451e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.554218] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-829f18d4-a149-4621-ad13-3faa8d924229 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.567015] env[60764]: DEBUG nova.compute.provider_tree [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1128.576632] env[60764]: DEBUG nova.scheduler.client.report [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1128.592857] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.550s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1128.593370] env[60764]: ERROR nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1128.593370] env[60764]: Faults: ['InvalidArgument'] [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Traceback (most recent call last): [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1128.593370] env[60764]: ERROR nova.compute.manager 
[instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self.driver.spawn(context, instance, image_meta, [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self._fetch_image_if_missing(context, vi) [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] image_cache(vi, tmp_image_ds_loc) [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] vm_util.copy_virtual_disk( [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] session._wait_for_task(vmdk_copy_task) [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] return self.wait_for_task(task_ref) [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] return evt.wait() [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] result = hub.switch() [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] return self.greenlet.switch() [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] self.f(*self.args, **self.kw) [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] raise exceptions.translate_fault(task_info.error) [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Faults: ['InvalidArgument'] [ 1128.593370] env[60764]: ERROR nova.compute.manager [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] [ 1128.594316] env[60764]: DEBUG nova.compute.utils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1128.595431] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Build of instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 was re-scheduled: A specified parameter was not correct: fileType [ 1128.595431] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1128.595791] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1128.595961] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1128.596179] env[60764]: DEBUG nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1128.596359] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1129.208432] env[60764]: DEBUG nova.network.neutron [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1129.221883] env[60764]: INFO nova.compute.manager [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Took 0.62 seconds to deallocate network for instance. [ 1129.335511] env[60764]: INFO nova.scheduler.client.report [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Deleted allocations for instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 [ 1129.357916] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3435d165-1fe2-442e-8b32-deac427ac018 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 569.748s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1129.359060] env[60764]: DEBUG oslo_concurrency.lockutils [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 370.938s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1129.359279] env[60764]: DEBUG oslo_concurrency.lockutils [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Acquiring lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1129.359480] env[60764]: DEBUG oslo_concurrency.lockutils [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1129.359638] env[60764]: DEBUG oslo_concurrency.lockutils [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1129.361669] env[60764]: INFO nova.compute.manager [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Terminating instance [ 1129.363377] env[60764]: DEBUG nova.compute.manager [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1129.363576] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1129.364040] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f97d3e3c-2b7e-433c-9baf-343f9585f8b6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.375400] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97111604-f4c7-4da3-bfb9-9fcb1925c45e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.385889] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1129.406743] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5 could not be found. [ 1129.406959] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1129.407222] env[60764]: INFO nova.compute.manager [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1129.407508] env[60764]: DEBUG oslo.service.loopingcall [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1129.407772] env[60764]: DEBUG nova.compute.manager [-] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1129.407903] env[60764]: DEBUG nova.network.neutron [-] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1129.433153] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1129.433446] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1129.434948] env[60764]: INFO nova.compute.claims [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1129.465151] env[60764]: DEBUG nova.network.neutron [-] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1129.474106] env[60764]: INFO nova.compute.manager [-] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] Took 0.07 seconds to deallocate network for instance. 
[ 1129.589616] env[60764]: DEBUG oslo_concurrency.lockutils [None req-669cf744-fea9-4e33-8635-5d16dd18e0d0 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.230s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1129.590215] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 33.215s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1129.590394] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 4e82aa9c-ae76-4b49-b666-5d5adc22d1b5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1129.590566] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "4e82aa9c-ae76-4b49-b666-5d5adc22d1b5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1129.759530] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36189413-d9b9-401d-8fb1-789093f25cbd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.766904] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-118ac0b8-25ac-446f-918d-c30c6e3593cc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.797733] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe9d024b-b44f-4c74-95ce-fc5c56f0b517 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.804910] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b336ba07-bd17-4c69-b18b-9b46b36651be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.817647] env[60764]: DEBUG nova.compute.provider_tree [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1129.827747] env[60764]: DEBUG nova.scheduler.client.report [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1129.840661] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1129.841171] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1129.875178] env[60764]: DEBUG nova.compute.utils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1129.878042] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1129.878042] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1129.885418] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1129.938385] env[60764]: DEBUG nova.policy [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea610bd090174ada998e90ae80c12d26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '68087e4eee9a47cf8e001c2772f75aec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1129.952487] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1129.979326] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1129.979552] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1129.979707] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1129.979896] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1129.980084] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1129.980237] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1129.980439] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1129.980597] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1129.980761] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1129.980916] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1129.981100] env[60764]: DEBUG nova.virt.hardware [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1129.982019] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7847d1c-3b62-4db4-8864-fe1ac08f916e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.989776] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af81019e-f31d-4918-898c-fdf0b21354f4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.326495] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Successfully created port: f6822e8c-c9b4-40c2-a478-d555f26991bc {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1131.050546] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Successfully updated port: f6822e8c-c9b4-40c2-a478-d555f26991bc {{(pid=60764) 
_update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1131.062622] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "refresh_cache-f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1131.062807] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquired lock "refresh_cache-f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1131.062964] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1131.120970] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1131.335598] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Updating instance_info_cache with network_info: [{"id": "f6822e8c-c9b4-40c2-a478-d555f26991bc", "address": "fa:16:3e:0f:e1:f5", "network": {"id": "8fa370dd-5280-4188-8cdb-b120c7c988a1", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1800253634-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "68087e4eee9a47cf8e001c2772f75aec", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6399297e-11b6-47b0-9a9f-712bb90b6ea1", "external-id": "nsx-vlan-transportzone-213", "segmentation_id": 213, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf6822e8c-c9", "ovs_interfaceid": "f6822e8c-c9b4-40c2-a478-d555f26991bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1131.349070] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Releasing lock "refresh_cache-f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" {{(pid=60764) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1131.349441] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance network_info: |[{"id": "f6822e8c-c9b4-40c2-a478-d555f26991bc", "address": "fa:16:3e:0f:e1:f5", "network": {"id": "8fa370dd-5280-4188-8cdb-b120c7c988a1", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1800253634-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "68087e4eee9a47cf8e001c2772f75aec", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6399297e-11b6-47b0-9a9f-712bb90b6ea1", "external-id": "nsx-vlan-transportzone-213", "segmentation_id": 213, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf6822e8c-c9", "ovs_interfaceid": "f6822e8c-c9b4-40c2-a478-d555f26991bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1131.349846] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0f:e1:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6399297e-11b6-47b0-9a9f-712bb90b6ea1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f6822e8c-c9b4-40c2-a478-d555f26991bc', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1131.358044] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Creating folder: Project (68087e4eee9a47cf8e001c2772f75aec). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1131.358044] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f5655206-b124-4d5c-8b1f-591f8bcb73be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.368952] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Created folder: Project (68087e4eee9a47cf8e001c2772f75aec) in parent group-v449629. [ 1131.369160] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Creating folder: Instances. Parent ref: group-v449701. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1131.369401] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a320658-b01f-4e77-901e-d5b50443dacb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.378297] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Created folder: Instances in parent group-v449701. [ 1131.378297] env[60764]: DEBUG oslo.service.loopingcall [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1131.378476] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1131.378630] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ed145000-d627-4103-8313-c4bc00814c5e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.397850] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1131.397850] env[60764]: value = "task-2204955" [ 1131.397850] env[60764]: _type = "Task" [ 1131.397850] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1131.405208] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204955, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1131.491406] env[60764]: DEBUG nova.compute.manager [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Received event network-vif-plugged-f6822e8c-c9b4-40c2-a478-d555f26991bc {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1131.491738] env[60764]: DEBUG oslo_concurrency.lockutils [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] Acquiring lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1131.492036] env[60764]: DEBUG oslo_concurrency.lockutils [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1131.492337] env[60764]: DEBUG oslo_concurrency.lockutils [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1131.492537] env[60764]: DEBUG nova.compute.manager [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] No waiting events found dispatching network-vif-plugged-f6822e8c-c9b4-40c2-a478-d555f26991bc {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1131.492754] env[60764]: WARNING nova.compute.manager [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Received unexpected event network-vif-plugged-f6822e8c-c9b4-40c2-a478-d555f26991bc for instance with vm_state building and task_state spawning. [ 1131.492927] env[60764]: DEBUG nova.compute.manager [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Received event network-changed-f6822e8c-c9b4-40c2-a478-d555f26991bc {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1131.493091] env[60764]: DEBUG nova.compute.manager [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Refreshing instance network info cache due to event network-changed-f6822e8c-c9b4-40c2-a478-d555f26991bc. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1131.493281] env[60764]: DEBUG oslo_concurrency.lockutils [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] Acquiring lock "refresh_cache-f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1131.493436] env[60764]: DEBUG oslo_concurrency.lockutils [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] Acquired lock "refresh_cache-f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1131.493559] env[60764]: DEBUG nova.network.neutron [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Refreshing network info cache for port f6822e8c-c9b4-40c2-a478-d555f26991bc {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1131.841173] env[60764]: DEBUG nova.network.neutron [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Updated VIF entry in instance network info cache for port f6822e8c-c9b4-40c2-a478-d555f26991bc. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1131.841539] env[60764]: DEBUG nova.network.neutron [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Updating instance_info_cache with network_info: [{"id": "f6822e8c-c9b4-40c2-a478-d555f26991bc", "address": "fa:16:3e:0f:e1:f5", "network": {"id": "8fa370dd-5280-4188-8cdb-b120c7c988a1", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1800253634-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "68087e4eee9a47cf8e001c2772f75aec", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6399297e-11b6-47b0-9a9f-712bb90b6ea1", "external-id": "nsx-vlan-transportzone-213", "segmentation_id": 213, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf6822e8c-c9", "ovs_interfaceid": "f6822e8c-c9b4-40c2-a478-d555f26991bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1131.850716] env[60764]: DEBUG oslo_concurrency.lockutils [req-75973911-66d0-44a9-b572-d28eb317832a req-b8e07b91-7063-4ea4-afb1-208683dba8af service nova] Releasing lock "refresh_cache-f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1131.907694] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204955, 'name': CreateVM_Task, 'duration_secs': 0.301924} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1131.907865] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1131.908597] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1131.908758] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1131.909132] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1131.909377] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-28eab7f4-5393-4ef4-828f-6ae95889ef36 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.913845] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Waiting for the task: (returnval){ [ 1131.913845] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52b7f63b-081a-23f7-ae42-4d50b186a48a" [ 1131.913845] env[60764]: _type = "Task" [ 1131.913845] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1131.923884] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52b7f63b-081a-23f7-ae42-4d50b186a48a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1132.424694] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1132.424954] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1132.425172] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1137.909701] env[60764]: DEBUG oslo_concurrency.lockutils [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.216757] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "c645f7f5-528b-4719-96dd-8e50a46b4261" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.216757] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1155.330150] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1155.330434] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1155.330769] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9915}} [ 1155.365487] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.365660] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.365786] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.365918] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.366107] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.366263] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.366361] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.366478] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.366585] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.366695] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1155.367243] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1156.330669] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1156.348057] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1156.348057] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1156.348155] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1156.348743] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1156.351186] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9c95012-614d-4581-b012-4cea21f39d92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.362895] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0ba1b37-2edf-4703-8c5d-2c193e8ea971 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.382153] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d6bef2-90af-4639-a74a-246a9a706eaf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.391267] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3dbc0c0-5f56-40ee-b2e6-1bf933c52250 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1156.431013] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181250MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1156.431539] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1156.431539] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1156.521670] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 437e0c0d-6d0e-4465-9651-14e420b646ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.524025] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1156.539739] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 69874f31-5316-4a8e-be6c-f77ac9f6ffbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.552898] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 919607ac-6116-49a7-a575-aff30f9e4c86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.575207] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c0c847ad-0c16-4796-a28c-9efdd19b7096 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.594158] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 33f526a8-730d-4264-b24a-1dd892343a15 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.607339] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f7cbb5ac-1fcf-457a-adea-bce8e9765699 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.619884] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74709301-6eae-40c1-b987-4be9262ef7ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.631947] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.643715] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance d8e88a52-0f72-4824-9ab5-3ebc8b3509bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.658483] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 75218609-125a-4cb1-90c8-8a508951d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.673734] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c5cfe38e-479e-4823-8ed7-2de3f31f47f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.686914] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.698824] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.710185] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1156.710433] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1156.710577] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1157.056155] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-573310b3-83b6-4652-a0c4-145e5c701604 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.064454] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac2bbb7c-df71-4675-ada0-ac1ff3b69070 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.096235] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5689dae4-3605-43ef-bdf1-c52b211477b2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.103552] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-035b1cf2-8965-42d3-bb9e-f539faea248d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1157.117815] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1157.126355] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1157.142692] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1157.143117] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.711s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1157.755341] env[60764]: DEBUG oslo_concurrency.lockutils [None req-08f96d72-175b-439b-9ccf-2ceeaa03052f tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "9debc548-4034-4ba7-93a4-915fc6ad3229" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1157.755687] env[60764]: DEBUG oslo_concurrency.lockutils [None req-08f96d72-175b-439b-9ccf-2ceeaa03052f tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "9debc548-4034-4ba7-93a4-915fc6ad3229" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1158.142913] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1158.333065] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1158.333065] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1158.333065] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.984665] env[60764]: DEBUG oslo_concurrency.lockutils [None req-28f9bae5-7746-4e94-a457-441894b0ca0d tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "c8ddfd42-c132-48e3-bade-f103a1bdea07" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.984996] env[60764]: DEBUG oslo_concurrency.lockutils [None req-28f9bae5-7746-4e94-a457-441894b0ca0d tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "c8ddfd42-c132-48e3-bade-f103a1bdea07" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1160.326102] env[60764]: DEBUG 
oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1161.330669] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1161.330928] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1162.331420] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1165.195476] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bc90d127-57da-484a-a5ec-aaece59bdaf5 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] Acquiring lock "8b2206e9-2b46-44f8-a756-80be988926a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1165.195476] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bc90d127-57da-484a-a5ec-aaece59bdaf5 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] Lock "8b2206e9-2b46-44f8-a756-80be988926a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1165.637676] env[60764]: DEBUG oslo_concurrency.lockutils [None req-38395585-9e79-4c53-b9ea-0234f09eb1b9 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] Acquiring lock "94b4adb4-6119-489a-820e-701790136809" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1165.637676] env[60764]: DEBUG oslo_concurrency.lockutils [None req-38395585-9e79-4c53-b9ea-0234f09eb1b9 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] Lock "94b4adb4-6119-489a-820e-701790136809" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1168.531569] env[60764]: DEBUG oslo_concurrency.lockutils [None req-88ac6484-7180-4d99-85f1-68409d37ad73 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] Acquiring lock "b0b63493-2864-4767-a20c-83db66f395c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.532946] env[60764]: DEBUG oslo_concurrency.lockutils [None req-88ac6484-7180-4d99-85f1-68409d37ad73 
tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] Lock "b0b63493-2864-4767-a20c-83db66f395c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1168.854885] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3d8629e2-1b5f-482f-8979-ff064b10ddb3 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] Acquiring lock "f1e869e4-179c-4ea1-9a50-c560e9d2f78b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.855313] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3d8629e2-1b5f-482f-8979-ff064b10ddb3 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] Lock "f1e869e4-179c-4ea1-9a50-c560e9d2f78b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1174.063942] env[60764]: WARNING oslo_vmware.rw_handles [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1174.063942] env[60764]: ERROR oslo_vmware.rw_handles [ 1174.064623] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1174.066501] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 
tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1174.066737] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Copying Virtual Disk [datastore2] vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/60eeb4ce-a310-4e99-9eeb-2c2f99766dea/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1174.067075] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-011ff047-a047-450b-ae48-242bb6a53d7e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.075364] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Waiting for the task: (returnval){ [ 1174.075364] env[60764]: value = "task-2204956" [ 1174.075364] env[60764]: _type = "Task" [ 1174.075364] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1174.083294] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Task: {'id': task-2204956, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1174.586891] env[60764]: DEBUG oslo_vmware.exceptions [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1174.587219] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1174.587890] env[60764]: ERROR nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1174.587890] env[60764]: Faults: ['InvalidArgument'] [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Traceback (most recent call last): [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] yield resources [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self.driver.spawn(context, instance, image_meta, [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self._fetch_image_if_missing(context, vi) [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] image_cache(vi, tmp_image_ds_loc) [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] vm_util.copy_virtual_disk( [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] session._wait_for_task(vmdk_copy_task) [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 
437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] return self.wait_for_task(task_ref) [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] return evt.wait() [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] result = hub.switch() [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] return self.greenlet.switch() [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self.f(*self.args, **self.kw) [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] raise exceptions.translate_fault(task_info.error) [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Faults: ['InvalidArgument'] [ 1174.587890] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] [ 1174.588966] env[60764]: INFO nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Terminating instance [ 1174.589689] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1174.589901] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1174.590162] env[60764]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a112937-5de6-4dc1-806d-e0fccf228dcb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.592544] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1174.592734] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1174.593544] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def53368-ac6b-4d06-ba83-9620cd8965aa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.601203] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1174.601203] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-704251b2-85ac-4f67-80e6-e693ff54e878 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.603050] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1174.603224] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1174.604219] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c4b15879-8f1a-4aca-bcce-cf6a9d2b317d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.609327] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Waiting for the task: (returnval){ [ 1174.609327] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52134a99-8ae6-bca7-39d6-b3b210bf020a" [ 1174.609327] env[60764]: _type = "Task" [ 1174.609327] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1174.616267] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52134a99-8ae6-bca7-39d6-b3b210bf020a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1174.673738] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1174.674013] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1174.674222] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Deleting the datastore file [datastore2] 437e0c0d-6d0e-4465-9651-14e420b646ae {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1174.674503] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-45697852-cd07-43c9-931f-65894e26169f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1174.681098] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Waiting for the task: (returnval){ [ 1174.681098] env[60764]: value = "task-2204958" [ 1174.681098] env[60764]: _type = "Task" [ 1174.681098] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1174.689325] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Task: {'id': task-2204958, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1175.120202] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1175.120509] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Creating directory with path [datastore2] vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1175.120693] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-abb5b3d2-7ed0-4cf1-84ab-8f0ee5f679a8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.131712] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Created directory with path [datastore2] vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1175.131948] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Fetch image to [datastore2] vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1175.132159] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1175.132940] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9baf4f5c-e0c7-4ddd-9562-1e524343ec1a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.139958] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f1a4c0-f07c-4c28-9575-9bea65e55750 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.148849] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dbce27c-982d-49a8-90c9-5c68fad64bed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.181091] env[60764]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cef565df-4bef-441b-815e-7adfd194a8ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.193795] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0a5dd4c4-e99e-463f-9210-c177739df87d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.194032] env[60764]: DEBUG oslo_vmware.api [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Task: {'id': task-2204958, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068436} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1175.194270] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1175.194442] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1175.194601] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1175.194762] env[60764]: INFO nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1175.196888] env[60764]: DEBUG nova.compute.claims [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1175.197171] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1175.197388] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1175.214596] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1175.472985] env[60764]: DEBUG oslo_vmware.rw_handles [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1175.532826] env[60764]: DEBUG oslo_vmware.rw_handles [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1175.533029] env[60764]: DEBUG oslo_vmware.rw_handles [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1175.607374] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a6bac24-ba05-40a2-bc24-e41f8cb74595 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.614849] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c5ad831-4553-4ebc-86a8-e11bc2c2ddfa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.645025] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09557ab0-6933-4ea1-82fb-e655792860f4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.652018] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5f03cee-c8a9-4857-8223-eaf7331bbb23 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1175.665122] env[60764]: DEBUG nova.compute.provider_tree [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1175.674627] env[60764]: DEBUG nova.scheduler.client.report [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1175.689792] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.492s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1175.690351] env[60764]: ERROR nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1175.690351] env[60764]: Faults: ['InvalidArgument'] [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Traceback (most recent call last): [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 
437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self.driver.spawn(context, instance, image_meta, [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] self._fetch_image_if_missing(context, vi) [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] image_cache(vi, tmp_image_ds_loc) [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] vm_util.copy_virtual_disk( [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] session._wait_for_task(vmdk_copy_task) [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] return self.wait_for_task(task_ref) [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] return evt.wait() [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] result = hub.switch() [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] return self.greenlet.switch() [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] 
self.f(*self.args, **self.kw) [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] raise exceptions.translate_fault(task_info.error) [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Faults: ['InvalidArgument'] [ 1175.690351] env[60764]: ERROR nova.compute.manager [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] [ 1175.691409] env[60764]: DEBUG nova.compute.utils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1175.692745] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Build of instance 437e0c0d-6d0e-4465-9651-14e420b646ae was re-scheduled: A specified parameter was not correct: fileType [ 1175.692745] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1175.693128] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1175.693313] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1175.693479] env[60764]: DEBUG nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1175.693640] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1176.156712] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cbfc7571-c035-4164-bcaf-251391837c92 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "c32c50ad-0818-478e-8cfa-c34902153a2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.159020] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cbfc7571-c035-4164-bcaf-251391837c92 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "c32c50ad-0818-478e-8cfa-c34902153a2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.164785] env[60764]: DEBUG nova.network.neutron [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1176.181828] env[60764]: INFO nova.compute.manager [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Took 0.49 seconds to deallocate network for instance. 
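For reference, the failure traced above is raised when oslo.vmware polls the CopyVirtualDisk task and translates the vCenter fault into a VimFaultException; the 'InvalidArgument' fault on fileType is what forces the re-schedule logged at 1175.692745. The sketch below is a minimal, illustrative way for a caller to wait on such a task and classify that fault. It is not the exact nova code path: the helper name wait_for_disk_copy and its session/copy_task arguments are assumptions for the example.

    import logging

    from oslo_vmware import exceptions as vexc

    LOG = logging.getLogger(__name__)

    def wait_for_disk_copy(session, copy_task):
        """Wait on a vCenter CopyVirtualDisk task and flag InvalidArgument faults."""
        try:
            # oslo.vmware polls the task state and raises a translated exception
            # when the task errors out (seen above as VimFaultException:
            # "A specified parameter was not correct: fileType").
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as err:
            if 'InvalidArgument' in (err.fault_list or []):
                # Matches the log: vCenter rejected the copy spec, so the
                # compute manager re-schedules the build rather than retrying here.
                LOG.warning("CopyVirtualDisk rejected by vCenter: %s (faults=%s)",
                            err, err.fault_list)
            raise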
[ 1176.285683] env[60764]: INFO nova.scheduler.client.report [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Deleted allocations for instance 437e0c0d-6d0e-4465-9651-14e420b646ae [ 1176.308320] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0dde51af-3807-4006-8262-7692a658dbb6 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 616.120s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.309564] env[60764]: DEBUG oslo_concurrency.lockutils [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 416.938s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.309781] env[60764]: DEBUG oslo_concurrency.lockutils [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Acquiring lock "437e0c0d-6d0e-4465-9651-14e420b646ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.309977] env[60764]: DEBUG oslo_concurrency.lockutils [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.310156] env[60764]: DEBUG oslo_concurrency.lockutils [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.312367] env[60764]: INFO nova.compute.manager [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Terminating instance [ 1176.314089] env[60764]: DEBUG nova.compute.manager [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1176.314300] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1176.314943] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-41776813-3049-4bfa-8d49-324b4ed7fd4f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.322914] env[60764]: DEBUG nova.compute.manager [None req-d00705eb-25dc-4917-ba74-f6bfdff21186 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 594e3624-e282-4695-a6a7-88ab1e2ddfff] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.328748] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98f7d55f-ca20-4547-9ffd-fff72b6ef92e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.357668] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 437e0c0d-6d0e-4465-9651-14e420b646ae could not be found. [ 1176.358664] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1176.358664] env[60764]: INFO nova.compute.manager [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1176.358664] env[60764]: DEBUG oslo.service.loopingcall [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1176.358664] env[60764]: DEBUG nova.compute.manager [None req-d00705eb-25dc-4917-ba74-f6bfdff21186 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] [instance: 594e3624-e282-4695-a6a7-88ab1e2ddfff] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.359576] env[60764]: DEBUG nova.compute.manager [-] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1176.359643] env[60764]: DEBUG nova.network.neutron [-] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1176.383217] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d00705eb-25dc-4917-ba74-f6bfdff21186 tempest-VolumesAdminNegativeTest-518373138 tempest-VolumesAdminNegativeTest-518373138-project-member] Lock "594e3624-e282-4695-a6a7-88ab1e2ddfff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.095s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.391303] env[60764]: DEBUG nova.network.neutron [-] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1176.399279] env[60764]: DEBUG nova.compute.manager [None req-dee959c2-bbe7-444c-98c2-94036d087188 tempest-ServerShowV257Test-236663532 tempest-ServerShowV257Test-236663532-project-member] [instance: ba8d2109-e600-4992-b997-c998ae288b59] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.401848] env[60764]: INFO nova.compute.manager [-] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] Took 0.04 seconds to deallocate network for instance. [ 1176.422928] env[60764]: DEBUG nova.compute.manager [None req-dee959c2-bbe7-444c-98c2-94036d087188 tempest-ServerShowV257Test-236663532 tempest-ServerShowV257Test-236663532-project-member] [instance: ba8d2109-e600-4992-b997-c998ae288b59] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.446675] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dee959c2-bbe7-444c-98c2-94036d087188 tempest-ServerShowV257Test-236663532 tempest-ServerShowV257Test-236663532-project-member] Lock "ba8d2109-e600-4992-b997-c998ae288b59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.649s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.456986] env[60764]: DEBUG nova.compute.manager [None req-ff04037b-044e-4ffd-898d-f89290e17190 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: d8838301-49a7-4291-8091-6fc90fabc7bb] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.481996] env[60764]: DEBUG nova.compute.manager [None req-ff04037b-044e-4ffd-898d-f89290e17190 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] [instance: d8838301-49a7-4291-8091-6fc90fabc7bb] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.491689] env[60764]: DEBUG oslo_concurrency.lockutils [None req-62ef65e4-2609-47c4-9e7c-b51bd5ab9e2a tempest-FloatingIPsAssociationNegativeTestJSON-1947311041 tempest-FloatingIPsAssociationNegativeTestJSON-1947311041-project-member] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.492569] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 80.117s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.492974] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 437e0c0d-6d0e-4465-9651-14e420b646ae] During sync_power_state the instance has a pending task (deleting). Skip. [ 1176.492974] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "437e0c0d-6d0e-4465-9651-14e420b646ae" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.506043] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ff04037b-044e-4ffd-898d-f89290e17190 tempest-MigrationsAdminTest-1153113698 tempest-MigrationsAdminTest-1153113698-project-member] Lock "d8838301-49a7-4291-8091-6fc90fabc7bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 220.641s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.514568] env[60764]: DEBUG nova.compute.manager [None req-931ef5a6-6237-4823-b4f4-d4270781bf5b tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] [instance: 69874f31-5316-4a8e-be6c-f77ac9f6ffbc] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.558850] env[60764]: DEBUG nova.compute.manager [None req-931ef5a6-6237-4823-b4f4-d4270781bf5b tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] [instance: 69874f31-5316-4a8e-be6c-f77ac9f6ffbc] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.580905] env[60764]: DEBUG oslo_concurrency.lockutils [None req-931ef5a6-6237-4823-b4f4-d4270781bf5b tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Lock "69874f31-5316-4a8e-be6c-f77ac9f6ffbc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.174s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.595640] env[60764]: DEBUG nova.compute.manager [None req-afe52671-b4ed-4a57-80f1-0e3367827473 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] [instance: 919607ac-6116-49a7-a575-aff30f9e4c86] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.620379] env[60764]: DEBUG nova.compute.manager [None req-afe52671-b4ed-4a57-80f1-0e3367827473 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] [instance: 919607ac-6116-49a7-a575-aff30f9e4c86] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.643390] env[60764]: DEBUG oslo_concurrency.lockutils [None req-afe52671-b4ed-4a57-80f1-0e3367827473 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Lock "919607ac-6116-49a7-a575-aff30f9e4c86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.681s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.651425] env[60764]: DEBUG nova.compute.manager [None req-285d52f0-159f-4f3c-92e1-573a0676dbe2 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] [instance: c0c847ad-0c16-4796-a28c-9efdd19b7096] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.672510] env[60764]: DEBUG nova.compute.manager [None req-285d52f0-159f-4f3c-92e1-573a0676dbe2 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] [instance: c0c847ad-0c16-4796-a28c-9efdd19b7096] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.692102] env[60764]: DEBUG oslo_concurrency.lockutils [None req-285d52f0-159f-4f3c-92e1-573a0676dbe2 tempest-ListServerFiltersTestJSON-1714896274 tempest-ListServerFiltersTestJSON-1714896274-project-member] Lock "c0c847ad-0c16-4796-a28c-9efdd19b7096" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.972s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.700117] env[60764]: DEBUG nova.compute.manager [None req-4e271d84-cd3d-4668-a5da-9711aad0d67f tempest-AttachInterfacesV270Test-2106429919 tempest-AttachInterfacesV270Test-2106429919-project-member] [instance: 33f526a8-730d-4264-b24a-1dd892343a15] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.722284] env[60764]: DEBUG nova.compute.manager [None req-4e271d84-cd3d-4668-a5da-9711aad0d67f tempest-AttachInterfacesV270Test-2106429919 tempest-AttachInterfacesV270Test-2106429919-project-member] [instance: 33f526a8-730d-4264-b24a-1dd892343a15] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.750717] env[60764]: DEBUG oslo_concurrency.lockutils [None req-4e271d84-cd3d-4668-a5da-9711aad0d67f tempest-AttachInterfacesV270Test-2106429919 tempest-AttachInterfacesV270Test-2106429919-project-member] Lock "33f526a8-730d-4264-b24a-1dd892343a15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.523s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.759370] env[60764]: DEBUG nova.compute.manager [None req-52288769-4b2d-45b5-9af8-2178ead9b69f tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f7cbb5ac-1fcf-457a-adea-bce8e9765699] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.781633] env[60764]: DEBUG nova.compute.manager [None req-52288769-4b2d-45b5-9af8-2178ead9b69f tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f7cbb5ac-1fcf-457a-adea-bce8e9765699] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1176.800786] env[60764]: DEBUG oslo_concurrency.lockutils [None req-52288769-4b2d-45b5-9af8-2178ead9b69f tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f7cbb5ac-1fcf-457a-adea-bce8e9765699" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.224s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.810049] env[60764]: DEBUG nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1176.858345] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.858591] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.860029] env[60764]: INFO nova.compute.claims [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1177.159086] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b84f473-3919-4f97-a078-d267c1182d17 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.166900] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06e1c233-bdc0-4e25-9a70-9dbe668ce5b4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.197556] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-351beebc-67f1-443c-bddf-6e5e29ca1252 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.204687] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-355ef974-2b91-4136-a5d7-b6a0e7c59997 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.217873] env[60764]: DEBUG nova.compute.provider_tree [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1177.227226] env[60764]: DEBUG nova.scheduler.client.report [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1177.241978] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.383s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1177.242445] env[60764]: DEBUG nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1177.256963] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "74709301-6eae-40c1-b987-4be9262ef7ce" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1177.270255] env[60764]: DEBUG nova.compute.claims [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1177.270438] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1177.270659] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1177.604702] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1129bae3-603e-4d86-bda2-e53b4ebf84ee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.612214] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44a22a1b-a8a0-4da5-b430-c6f3e39f5996 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.641461] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3785c7b-57fa-49bd-bed7-b8018cb71fc9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.648456] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4fd1224d-38a3-4ebc-bc65-b74289fe24a6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.661253] env[60764]: DEBUG nova.compute.provider_tree [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1177.670349] env[60764]: DEBUG nova.scheduler.client.report [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1177.683809] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.413s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1177.684577] env[60764]: DEBUG nova.compute.utils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Conflict updating instance 74709301-6eae-40c1-b987-4be9262ef7ce. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1177.685969] env[60764]: DEBUG nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance disappeared during build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2487}} [ 1177.686160] env[60764]: DEBUG nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1177.686383] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "refresh_cache-74709301-6eae-40c1-b987-4be9262ef7ce" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1177.686526] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquired lock "refresh_cache-74709301-6eae-40c1-b987-4be9262ef7ce" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1177.686681] env[60764]: DEBUG nova.network.neutron [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1177.742082] env[60764]: DEBUG nova.network.neutron [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1177.931508] env[60764]: DEBUG nova.network.neutron [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1177.943825] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Releasing lock "refresh_cache-74709301-6eae-40c1-b987-4be9262ef7ce" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1177.944893] env[60764]: DEBUG nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1177.944893] env[60764]: DEBUG nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1177.944893] env[60764]: DEBUG nova.network.neutron [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1177.962697] env[60764]: DEBUG nova.network.neutron [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1177.970183] env[60764]: DEBUG nova.network.neutron [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1177.978104] env[60764]: INFO nova.compute.manager [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Took 0.03 seconds to deallocate network for instance. 
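The "Inventory has not changed" report lines above (1177.227226 and 1177.670349) carry the resource provider inventory that the claims in this section are checked against. As a rough illustration only (not nova or placement code), the schedulable capacity per resource class follows from total, reserved and allocation_ratio as roughly (total - reserved) * allocation_ratio; the sketch below recomputes it from the logged inventory, keeping only the fields it uses.

    # Rough illustration: effective capacity per resource class for provider
    # 67a94047-1c18-43e8-9b47-05a1d30bcca4, using values copied from the log.
    INVENTORY = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def schedulable_capacity(inventory):
        """Apply (total - reserved) * allocation_ratio per resource class."""
        return {
            rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inventory.items()
        }

    # -> {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
    print(schedulable_capacity(INVENTORY))

Against that capacity a single small instance is trivially satisfiable, which is consistent with the "Claim successful" line at 1176.860029.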
[ 1178.045426] env[60764]: INFO nova.scheduler.client.report [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Deleted allocations for instance 74709301-6eae-40c1-b987-4be9262ef7ce [ 1178.045699] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0d7e30f0-cb04-44bb-9022-9c3c2b191acd tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "74709301-6eae-40c1-b987-4be9262ef7ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.125s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1178.047153] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "74709301-6eae-40c1-b987-4be9262ef7ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.791s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1178.047374] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "74709301-6eae-40c1-b987-4be9262ef7ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1178.047570] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "74709301-6eae-40c1-b987-4be9262ef7ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1178.047732] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "74709301-6eae-40c1-b987-4be9262ef7ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1178.049928] env[60764]: INFO nova.compute.manager [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Terminating instance [ 1178.051382] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquiring lock "refresh_cache-74709301-6eae-40c1-b987-4be9262ef7ce" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1178.051539] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 
tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Acquired lock "refresh_cache-74709301-6eae-40c1-b987-4be9262ef7ce" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1178.051701] env[60764]: DEBUG nova.network.neutron [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1178.056251] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1178.081438] env[60764]: DEBUG nova.network.neutron [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1178.106968] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1178.107264] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1178.108798] env[60764]: INFO nova.compute.claims [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1178.194553] env[60764]: DEBUG nova.network.neutron [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1178.202918] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Releasing lock "refresh_cache-74709301-6eae-40c1-b987-4be9262ef7ce" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1178.203346] env[60764]: DEBUG nova.compute.manager [None req-95899460-c105-427b-a1c8-a8b7ef063941 
tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1178.203516] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1178.204065] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d5f5949f-ed6d-4244-8abe-0399a00bc953 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.214347] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd566a59-7b43-4e2b-a8e9-09a978cfb82b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.245685] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 74709301-6eae-40c1-b987-4be9262ef7ce could not be found. [ 1178.245758] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1178.246061] env[60764]: INFO nova.compute.manager [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1178.246424] env[60764]: DEBUG oslo.service.loopingcall [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1178.247395] env[60764]: DEBUG nova.compute.manager [-] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1178.247395] env[60764]: DEBUG nova.network.neutron [-] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1178.265073] env[60764]: DEBUG nova.network.neutron [-] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1178.272175] env[60764]: DEBUG nova.network.neutron [-] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1178.280459] env[60764]: INFO nova.compute.manager [-] [instance: 74709301-6eae-40c1-b987-4be9262ef7ce] Took 0.03 seconds to deallocate network for instance. [ 1178.365182] env[60764]: DEBUG oslo_concurrency.lockutils [None req-95899460-c105-427b-a1c8-a8b7ef063941 tempest-InstanceActionsNegativeTestJSON-1038676421 tempest-InstanceActionsNegativeTestJSON-1038676421-project-member] Lock "74709301-6eae-40c1-b987-4be9262ef7ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.318s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1178.432529] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13da1810-99dd-4264-97d2-9add70983c9f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.440291] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45d8154c-e6c5-47b6-9885-b11085a09d50 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.472093] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca82ca79-481d-4b49-8d53-8189a70f584f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.479630] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73aa865d-bfb4-470f-bbcb-e0ebca38a7ed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.492979] env[60764]: DEBUG nova.compute.provider_tree [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1178.502407] env[60764]: DEBUG nova.scheduler.client.report [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1178.518679] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.411s 
{{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1178.519214] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1178.561625] env[60764]: DEBUG nova.compute.utils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1178.562924] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1178.563140] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1178.577216] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1178.653030] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1178.678866] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1178.679353] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1178.679664] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1178.679961] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1178.680195] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1178.680400] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1178.680689] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1178.680896] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1178.681133] 
env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1178.681356] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1178.681599] env[60764]: DEBUG nova.virt.hardware [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1178.682661] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b784165b-db8a-4704-b84f-67cee5456d38 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.691194] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95f24471-dfd5-4162-99dc-779a1f8681e6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.924847] env[60764]: DEBUG nova.policy [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '109be0ca12884406b4cdacddd180e624', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b4a239567804395a368e9ed14f322c2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1179.595068] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Successfully created port: 227bafb7-d793-4a7d-b210-372c31b483d8 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1180.431085] env[60764]: DEBUG nova.compute.manager [req-63e528fa-448e-4637-a4a1-af7cfbd713db req-349d3f92-0b04-4171-ba45-c2a22533fc12 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Received event network-vif-plugged-227bafb7-d793-4a7d-b210-372c31b483d8 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1180.431323] env[60764]: DEBUG oslo_concurrency.lockutils [req-63e528fa-448e-4637-a4a1-af7cfbd713db req-349d3f92-0b04-4171-ba45-c2a22533fc12 service nova] Acquiring lock "b3ca6987-3415-4db5-a514-cd66c342eb7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1180.431532] env[60764]: DEBUG oslo_concurrency.lockutils 
[req-63e528fa-448e-4637-a4a1-af7cfbd713db req-349d3f92-0b04-4171-ba45-c2a22533fc12 service nova] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1180.431694] env[60764]: DEBUG oslo_concurrency.lockutils [req-63e528fa-448e-4637-a4a1-af7cfbd713db req-349d3f92-0b04-4171-ba45-c2a22533fc12 service nova] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1180.431858] env[60764]: DEBUG nova.compute.manager [req-63e528fa-448e-4637-a4a1-af7cfbd713db req-349d3f92-0b04-4171-ba45-c2a22533fc12 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] No waiting events found dispatching network-vif-plugged-227bafb7-d793-4a7d-b210-372c31b483d8 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1180.432896] env[60764]: WARNING nova.compute.manager [req-63e528fa-448e-4637-a4a1-af7cfbd713db req-349d3f92-0b04-4171-ba45-c2a22533fc12 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Received unexpected event network-vif-plugged-227bafb7-d793-4a7d-b210-372c31b483d8 for instance with vm_state building and task_state spawning. [ 1180.517078] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Successfully updated port: 227bafb7-d793-4a7d-b210-372c31b483d8 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1180.533971] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "refresh_cache-b3ca6987-3415-4db5-a514-cd66c342eb7f" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1180.533971] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquired lock "refresh_cache-b3ca6987-3415-4db5-a514-cd66c342eb7f" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1180.533971] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1180.581865] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1180.778079] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Updating instance_info_cache with network_info: [{"id": "227bafb7-d793-4a7d-b210-372c31b483d8", "address": "fa:16:3e:6d:67:7f", "network": {"id": "a80be8d9-83a1-4cd8-b694-deff17584f0a", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-840986791-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b4a239567804395a368e9ed14f322c2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d94740a-bce8-4103-8ecf-230d02ec0a44", "external-id": "nsx-vlan-transportzone-149", "segmentation_id": 149, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap227bafb7-d7", "ovs_interfaceid": "227bafb7-d793-4a7d-b210-372c31b483d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1180.792799] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Releasing lock "refresh_cache-b3ca6987-3415-4db5-a514-cd66c342eb7f" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1180.793411] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance network_info: |[{"id": "227bafb7-d793-4a7d-b210-372c31b483d8", "address": "fa:16:3e:6d:67:7f", "network": {"id": "a80be8d9-83a1-4cd8-b694-deff17584f0a", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-840986791-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b4a239567804395a368e9ed14f322c2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d94740a-bce8-4103-8ecf-230d02ec0a44", "external-id": "nsx-vlan-transportzone-149", "segmentation_id": 149, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap227bafb7-d7", "ovs_interfaceid": "227bafb7-d793-4a7d-b210-372c31b483d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1180.793530] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6d:67:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d94740a-bce8-4103-8ecf-230d02ec0a44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '227bafb7-d793-4a7d-b210-372c31b483d8', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1180.801480] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Creating folder: Project (3b4a239567804395a368e9ed14f322c2). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1180.802080] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-29578dd6-d4c8-479a-a341-c041f16cbb14 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.814977] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Created folder: Project (3b4a239567804395a368e9ed14f322c2) in parent group-v449629. [ 1180.815475] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Creating folder: Instances. Parent ref: group-v449704. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1180.815651] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-08f9ad09-2480-4d04-afaa-4608f151eccc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.829995] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Created folder: Instances in parent group-v449704. [ 1180.829995] env[60764]: DEBUG oslo.service.loopingcall [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1180.829995] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1180.829995] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a0fa1728-84ef-441f-ad63-aa8449eaf7e8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.857664] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1180.857664] env[60764]: value = "task-2204961" [ 1180.857664] env[60764]: _type = "Task" [ 1180.857664] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1180.867932] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204961, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1181.368376] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204961, 'name': CreateVM_Task, 'duration_secs': 0.318198} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1181.368510] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1181.376560] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1181.376725] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1181.377107] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1181.377368] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3f784566-4877-4c68-aaca-bfb805119bb5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.383026] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Waiting for the task: (returnval){ [ 1181.383026] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52f80e09-5f1a-0ecf-3686-b37c082dd889" [ 1181.383026] env[60764]: _type = "Task" [ 1181.383026] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1181.398128] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1181.398397] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1181.398621] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1182.473397] env[60764]: DEBUG nova.compute.manager [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Received event network-changed-227bafb7-d793-4a7d-b210-372c31b483d8 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1182.473622] env[60764]: DEBUG nova.compute.manager [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Refreshing instance network info cache due to event network-changed-227bafb7-d793-4a7d-b210-372c31b483d8. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1182.473804] env[60764]: DEBUG oslo_concurrency.lockutils [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] Acquiring lock "refresh_cache-b3ca6987-3415-4db5-a514-cd66c342eb7f" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1182.473949] env[60764]: DEBUG oslo_concurrency.lockutils [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] Acquired lock "refresh_cache-b3ca6987-3415-4db5-a514-cd66c342eb7f" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1182.474157] env[60764]: DEBUG nova.network.neutron [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Refreshing network info cache for port 227bafb7-d793-4a7d-b210-372c31b483d8 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1182.928847] env[60764]: DEBUG oslo_concurrency.lockutils [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "b3ca6987-3415-4db5-a514-cd66c342eb7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1182.992622] env[60764]: DEBUG nova.network.neutron [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Updated VIF entry in instance network info cache for port 227bafb7-d793-4a7d-b210-372c31b483d8. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1182.992962] env[60764]: DEBUG nova.network.neutron [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Updating instance_info_cache with network_info: [{"id": "227bafb7-d793-4a7d-b210-372c31b483d8", "address": "fa:16:3e:6d:67:7f", "network": {"id": "a80be8d9-83a1-4cd8-b694-deff17584f0a", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-840986791-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b4a239567804395a368e9ed14f322c2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d94740a-bce8-4103-8ecf-230d02ec0a44", "external-id": "nsx-vlan-transportzone-149", "segmentation_id": 149, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap227bafb7-d7", "ovs_interfaceid": "227bafb7-d793-4a7d-b210-372c31b483d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1183.002217] env[60764]: DEBUG oslo_concurrency.lockutils [req-f8a63666-f4d4-4436-9a5e-60bdb08848aa req-c6b2a63f-3b55-4fc1-b65c-f4e550544fe4 service nova] Releasing lock "refresh_cache-b3ca6987-3415-4db5-a514-cd66c342eb7f" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1190.223997] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.224445] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.386930] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1b7fa0df-ee2d-4ed8-a9f7-23f126c65601 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "8d616729-866d-4ebb-8bc7-cf2172b70382" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.387407] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1b7fa0df-ee2d-4ed8-a9f7-23f126c65601 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock 
"8d616729-866d-4ebb-8bc7-cf2172b70382" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1200.924263] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ef83ffa2-ac55-4d6b-bd45-18ad1cc60e83 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Acquiring lock "5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1200.924542] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ef83ffa2-ac55-4d6b-bd45-18ad1cc60e83 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Lock "5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1216.331174] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1216.331431] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1216.331502] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1216.353743] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.353880] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354079] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354227] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354348] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354468] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354586] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354703] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354819] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.354932] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1216.355080] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1218.329186] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1218.329409] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1218.340668] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1218.340883] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1218.341060] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1218.341213] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1218.342317] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4399c58-94c7-41df-8fb2-70110ab6126e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.351281] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c9b08b7-da19-414e-948f-56dec462bb8c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.365029] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e70cf11-aa1d-4d1d-bda2-5728fbb76586 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.371023] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02417b19-cc84-4147-91af-9679ea7337e9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.399425] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181269MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1218.399624] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1218.399851] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1218.472670] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 1f11c625-166f-4609-badf-da4dd9475c37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.472823] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.472948] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473086] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473209] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473326] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473441] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473570] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473678] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.473786] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1218.483906] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.493431] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.502759] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.512066] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9debc548-4034-4ba7-93a4-915fc6ad3229 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.520623] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c8ddfd42-c132-48e3-bade-f103a1bdea07 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.529777] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8b2206e9-2b46-44f8-a756-80be988926a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.540082] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 94b4adb4-6119-489a-820e-701790136809 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.550448] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b0b63493-2864-4767-a20c-83db66f395c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.559824] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1e869e4-179c-4ea1-9a50-c560e9d2f78b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.569644] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c32c50ad-0818-478e-8cfa-c34902153a2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.578381] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.586935] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d616729-866d-4ebb-8bc7-cf2172b70382 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.595664] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1218.595886] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1218.596055] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1218.874776] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ad4473c-8c01-4ab4-9bb6-196e1d6052d0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.882715] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd41ba2e-da56-49e8-a48b-fff00abb9316 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.913432] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84411175-69e6-4837-b9d0-e0a2fb2eb5d6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.921677] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ef1958-79cf-4c12-8a14-3a5b50d88b24 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.936714] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1218.945070] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1218.958480] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1218.958675] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.559s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1219.955405] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.979502] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.979700] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1220.329864] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1222.325106] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1223.330068] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1223.330068] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1224.083480] env[60764]: WARNING oslo_vmware.rw_handles [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1224.083480] env[60764]: ERROR oslo_vmware.rw_handles [ 1224.084045] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1224.086366] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1224.086660] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Copying Virtual Disk [datastore2] vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/72a349a0-1c84-49bf-835a-fea2279084f8/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1224.086985] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fbe8cfbb-6f3c-4b8b-8683-8150e490502a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.096011] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 
tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Waiting for the task: (returnval){ [ 1224.096011] env[60764]: value = "task-2204962" [ 1224.096011] env[60764]: _type = "Task" [ 1224.096011] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1224.103573] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Task: {'id': task-2204962, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1224.330852] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1224.606353] env[60764]: DEBUG oslo_vmware.exceptions [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1224.606649] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1224.607253] env[60764]: ERROR nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1224.607253] env[60764]: Faults: ['InvalidArgument'] [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Traceback (most recent call last): [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] yield resources [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self.driver.spawn(context, instance, image_meta, [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 
1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self._fetch_image_if_missing(context, vi) [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] image_cache(vi, tmp_image_ds_loc) [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] vm_util.copy_virtual_disk( [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] session._wait_for_task(vmdk_copy_task) [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] return self.wait_for_task(task_ref) [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] return evt.wait() [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] result = hub.switch() [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] return self.greenlet.switch() [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self.f(*self.args, **self.kw) [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] raise exceptions.translate_fault(task_info.error) [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Faults: 
['InvalidArgument'] [ 1224.607253] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] [ 1224.607965] env[60764]: INFO nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Terminating instance [ 1224.609203] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1224.609462] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1224.610024] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1224.610217] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1224.610444] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-da9a833c-ee0d-4bbe-a5c6-3157b1fa3ddf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.612781] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2247ba6-fb80-4942-b925-8db86ea735ea {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.621345] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1224.621345] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-eb9de52a-feeb-403a-9d3e-25ae77e0d3f3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.622039] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1224.622240] env[60764]: DEBUG 
nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1224.623226] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1a48e24-391c-4d73-8a98-4ce577a9743c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.627703] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Waiting for the task: (returnval){ [ 1224.627703] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52257ea2-4fa2-69c4-85ea-dc3cd237eea9" [ 1224.627703] env[60764]: _type = "Task" [ 1224.627703] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1224.635008] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52257ea2-4fa2-69c4-85ea-dc3cd237eea9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1224.691758] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1224.691970] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1224.692172] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Deleting the datastore file [datastore2] 1f11c625-166f-4609-badf-da4dd9475c37 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1224.692446] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-253c03fa-1d4c-46a3-970b-ce26748df4c7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1224.698540] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Waiting for the task: (returnval){ [ 1224.698540] env[60764]: value = "task-2204964" [ 1224.698540] env[60764]: _type = "Task" [ 1224.698540] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1224.706611] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Task: {'id': task-2204964, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1225.137999] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1225.138287] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Creating directory with path [datastore2] vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1225.138506] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-17c3ad29-b1c2-4a27-98ee-29501ffe174a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.149820] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Created directory with path [datastore2] vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1225.149997] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Fetch image to [datastore2] vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1225.150177] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1225.150888] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be086ecc-9e3a-43b7-8134-393a04eb79cb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.157150] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0005901-b36d-4f65-9e0d-ec722e647a83 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.165881] env[60764]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ad594d2-ca26-462a-9b29-682973a129b9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.197020] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40371d76-43e5-41ce-9b87-5d3ed4713408 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.207489] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-efb5ad95-d822-4426-9adf-3b5731564115 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.209119] env[60764]: DEBUG oslo_vmware.api [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Task: {'id': task-2204964, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078958} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1225.209355] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1225.209534] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1225.209700] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1225.209869] env[60764]: INFO nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1225.212457] env[60764]: DEBUG nova.compute.claims [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1225.212619] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1225.212828] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1225.228871] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1225.418658] env[60764]: DEBUG oslo_vmware.rw_handles [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1225.477801] env[60764]: DEBUG oslo_vmware.rw_handles [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1225.477974] env[60764]: DEBUG oslo_vmware.rw_handles [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1225.576223] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bd0cd13-b0fe-426a-8677-c5d90a21c474 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.584575] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db216617-263d-46ce-ab61-acbe478a75bc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.613967] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a27bb8f-d4d2-4ad8-824c-037a3da070c7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.621354] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6147b3fa-2e09-44a6-b26e-596e635008f5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1225.635423] env[60764]: DEBUG nova.compute.provider_tree [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1225.644350] env[60764]: DEBUG nova.scheduler.client.report [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1225.659069] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.446s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1225.659664] env[60764]: ERROR nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1225.659664] env[60764]: Faults: ['InvalidArgument'] [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Traceback (most recent call last): [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self.driver.spawn(context, instance, image_meta, [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self._fetch_image_if_missing(context, vi) [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] image_cache(vi, tmp_image_ds_loc) [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] vm_util.copy_virtual_disk( [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] session._wait_for_task(vmdk_copy_task) [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] return self.wait_for_task(task_ref) [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] return evt.wait() [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] result = hub.switch() [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] return self.greenlet.switch() [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] self.f(*self.args, **self.kw) [ 1225.659664] 
env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] raise exceptions.translate_fault(task_info.error) [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Faults: ['InvalidArgument'] [ 1225.659664] env[60764]: ERROR nova.compute.manager [instance: 1f11c625-166f-4609-badf-da4dd9475c37] [ 1225.660442] env[60764]: DEBUG nova.compute.utils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1225.661869] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Build of instance 1f11c625-166f-4609-badf-da4dd9475c37 was re-scheduled: A specified parameter was not correct: fileType [ 1225.661869] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1225.662266] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1225.662440] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1225.662606] env[60764]: DEBUG nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1225.662767] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1225.999369] env[60764]: DEBUG nova.network.neutron [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1226.010879] env[60764]: INFO nova.compute.manager [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Took 0.35 seconds to deallocate network for instance. [ 1226.110348] env[60764]: INFO nova.scheduler.client.report [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Deleted allocations for instance 1f11c625-166f-4609-badf-da4dd9475c37 [ 1226.131937] env[60764]: DEBUG oslo_concurrency.lockutils [None req-0e90a69a-840f-4023-bb40-dd4c4ec6ec88 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "1f11c625-166f-4609-badf-da4dd9475c37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 665.006s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.133234] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "1f11c625-166f-4609-badf-da4dd9475c37" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 466.246s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1226.133418] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Acquiring lock "1f11c625-166f-4609-badf-da4dd9475c37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1226.133650] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 
tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "1f11c625-166f-4609-badf-da4dd9475c37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1226.133817] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "1f11c625-166f-4609-badf-da4dd9475c37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.135970] env[60764]: INFO nova.compute.manager [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Terminating instance [ 1226.137955] env[60764]: DEBUG nova.compute.manager [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1226.138038] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1226.138468] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7aa1ecc3-bcbf-48f9-b6f5-120441d61b94 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.148887] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f5a13eb-7708-4060-8de4-85677ccfc00b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.159291] env[60764]: DEBUG nova.compute.manager [None req-d33e9f4f-2bfa-4c01-be94-c640621d7d8a tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 26c4ebb8-f581-4c06-8000-c80fa09a2d27] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1226.179959] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1f11c625-166f-4609-badf-da4dd9475c37 could not be found. 
[ 1226.180214] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1226.180418] env[60764]: INFO nova.compute.manager [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1226.180698] env[60764]: DEBUG oslo.service.loopingcall [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1226.180943] env[60764]: DEBUG nova.compute.manager [-] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1226.181082] env[60764]: DEBUG nova.network.neutron [-] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1226.183661] env[60764]: DEBUG nova.compute.manager [None req-d33e9f4f-2bfa-4c01-be94-c640621d7d8a tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 26c4ebb8-f581-4c06-8000-c80fa09a2d27] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1226.203871] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d33e9f4f-2bfa-4c01-be94-c640621d7d8a tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "26c4ebb8-f581-4c06-8000-c80fa09a2d27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.822s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.206516] env[60764]: DEBUG nova.network.neutron [-] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1226.214068] env[60764]: INFO nova.compute.manager [-] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] Took 0.03 seconds to deallocate network for instance. [ 1226.215973] env[60764]: DEBUG nova.compute.manager [None req-9a71ea8c-1078-4427-9be8-e2bdd9010d95 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: d8e88a52-0f72-4824-9ab5-3ebc8b3509bb] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1226.238183] env[60764]: DEBUG nova.compute.manager [None req-9a71ea8c-1078-4427-9be8-e2bdd9010d95 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: d8e88a52-0f72-4824-9ab5-3ebc8b3509bb] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1226.260423] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9a71ea8c-1078-4427-9be8-e2bdd9010d95 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "d8e88a52-0f72-4824-9ab5-3ebc8b3509bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.069s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.273618] env[60764]: DEBUG nova.compute.manager [None req-492806f5-08d6-463e-bf4d-04d890dd7cdf tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] [instance: 75218609-125a-4cb1-90c8-8a508951d9a9] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1226.302954] env[60764]: DEBUG nova.compute.manager [None req-492806f5-08d6-463e-bf4d-04d890dd7cdf tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] [instance: 75218609-125a-4cb1-90c8-8a508951d9a9] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1226.314273] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8a74b88-0c1e-4096-88c7-51a841a87639 tempest-ServersWithSpecificFlavorTestJSON-1813561322 tempest-ServersWithSpecificFlavorTestJSON-1813561322-project-member] Lock "1f11c625-166f-4609-badf-da4dd9475c37" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.181s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.315387] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "1f11c625-166f-4609-badf-da4dd9475c37" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 129.939s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1226.315808] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 1f11c625-166f-4609-badf-da4dd9475c37] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1226.315934] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "1f11c625-166f-4609-badf-da4dd9475c37" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.324794] env[60764]: DEBUG oslo_concurrency.lockutils [None req-492806f5-08d6-463e-bf4d-04d890dd7cdf tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Lock "75218609-125a-4cb1-90c8-8a508951d9a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.436s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.333026] env[60764]: DEBUG nova.compute.manager [None req-89f2f3f7-dd74-4003-969e-c689003b239d tempest-ServerTagsTestJSON-125331092 tempest-ServerTagsTestJSON-125331092-project-member] [instance: c5cfe38e-479e-4823-8ed7-2de3f31f47f7] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1226.354841] env[60764]: DEBUG nova.compute.manager [None req-89f2f3f7-dd74-4003-969e-c689003b239d tempest-ServerTagsTestJSON-125331092 tempest-ServerTagsTestJSON-125331092-project-member] [instance: c5cfe38e-479e-4823-8ed7-2de3f31f47f7] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1226.376983] env[60764]: DEBUG oslo_concurrency.lockutils [None req-89f2f3f7-dd74-4003-969e-c689003b239d tempest-ServerTagsTestJSON-125331092 tempest-ServerTagsTestJSON-125331092-project-member] Lock "c5cfe38e-479e-4823-8ed7-2de3f31f47f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.940s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.386069] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1226.436063] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1226.436394] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1226.437919] env[60764]: INFO nova.compute.claims [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1226.756614] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d055190-0ef8-4d5e-9884-d80c33d41aff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.764703] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9c30654-9c07-4dd7-b9e7-d4c609aa2a1d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.794269] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f243750-1fc5-4953-8155-30d4306ef69e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.801694] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfd92422-cd41-4dfc-a9ff-7ad86680d51f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1226.815837] env[60764]: DEBUG nova.compute.provider_tree [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1226.824639] env[60764]: DEBUG nova.scheduler.client.report [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1226.838590] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.402s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1226.839103] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1226.895037] env[60764]: DEBUG nova.compute.utils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1226.896058] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1226.897030] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1226.907409] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1226.966866] env[60764]: DEBUG nova.policy [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9fea6fb06b845108ba49f93226ccd09', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ca812a12f2549ab981fe8c2a003d01e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1226.970038] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1226.994434] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1226.994722] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1226.995911] env[60764]: DEBUG nova.virt.hardware [None 
req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1226.996217] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1226.996343] env[60764]: DEBUG nova.virt.hardware [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1226.997375] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-152280e8-1e34-4ac8-a79c-09e9d3d54fb1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1227.005287] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2995ba49-bf26-4182-9b74-7f952433b4f7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1227.251814] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Successfully created port: b8806ba7-4db6-4687-ae1a-218d91385527 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1227.991118] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Successfully updated port: b8806ba7-4db6-4687-ae1a-218d91385527 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1228.002437] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "refresh_cache-6397ff19-1385-4e38-b199-666394582582" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1228.002824] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquired lock "refresh_cache-6397ff19-1385-4e38-b199-666394582582" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1228.003091] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1228.047105] env[60764]: DEBUG nova.compute.manager [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 
service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Received event network-vif-plugged-b8806ba7-4db6-4687-ae1a-218d91385527 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1228.047338] env[60764]: DEBUG oslo_concurrency.lockutils [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] Acquiring lock "6397ff19-1385-4e38-b199-666394582582-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1228.047545] env[60764]: DEBUG oslo_concurrency.lockutils [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] Lock "6397ff19-1385-4e38-b199-666394582582-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1228.047705] env[60764]: DEBUG oslo_concurrency.lockutils [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] Lock "6397ff19-1385-4e38-b199-666394582582-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1228.047871] env[60764]: DEBUG nova.compute.manager [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] No waiting events found dispatching network-vif-plugged-b8806ba7-4db6-4687-ae1a-218d91385527 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1228.048064] env[60764]: WARNING nova.compute.manager [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Received unexpected event network-vif-plugged-b8806ba7-4db6-4687-ae1a-218d91385527 for instance with vm_state building and task_state spawning. [ 1228.048247] env[60764]: DEBUG nova.compute.manager [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Received event network-changed-b8806ba7-4db6-4687-ae1a-218d91385527 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1228.048417] env[60764]: DEBUG nova.compute.manager [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Refreshing instance network info cache due to event network-changed-b8806ba7-4db6-4687-ae1a-218d91385527. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1228.048610] env[60764]: DEBUG oslo_concurrency.lockutils [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] Acquiring lock "refresh_cache-6397ff19-1385-4e38-b199-666394582582" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1228.055495] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1228.287363] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Updating instance_info_cache with network_info: [{"id": "b8806ba7-4db6-4687-ae1a-218d91385527", "address": "fa:16:3e:24:5a:87", "network": {"id": "61dbfc9b-6c39-478d-8814-6244010c9481", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1848041264-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8ca812a12f2549ab981fe8c2a003d01e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8806ba7-4d", "ovs_interfaceid": "b8806ba7-4db6-4687-ae1a-218d91385527", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1228.300556] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Releasing lock "refresh_cache-6397ff19-1385-4e38-b199-666394582582" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1228.300842] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance network_info: |[{"id": "b8806ba7-4db6-4687-ae1a-218d91385527", "address": "fa:16:3e:24:5a:87", "network": {"id": "61dbfc9b-6c39-478d-8814-6244010c9481", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1848041264-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8ca812a12f2549ab981fe8c2a003d01e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8806ba7-4d", "ovs_interfaceid": "b8806ba7-4db6-4687-ae1a-218d91385527", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1228.301183] env[60764]: DEBUG oslo_concurrency.lockutils [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] Acquired lock "refresh_cache-6397ff19-1385-4e38-b199-666394582582" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1228.301378] env[60764]: DEBUG nova.network.neutron [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Refreshing network info cache for port b8806ba7-4db6-4687-ae1a-218d91385527 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1228.302375] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:5a:87', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b8806ba7-4db6-4687-ae1a-218d91385527', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1228.309915] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Creating folder: Project (8ca812a12f2549ab981fe8c2a003d01e). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1228.311326] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1278f3e1-82d5-48a6-bb54-52510ed06bcd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.325091] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Created folder: Project (8ca812a12f2549ab981fe8c2a003d01e) in parent group-v449629. [ 1228.325275] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Creating folder: Instances. Parent ref: group-v449707. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1228.325499] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-221fbd93-730e-41bd-b8de-83cf813d410f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.334331] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Created folder: Instances in parent group-v449707. [ 1228.334559] env[60764]: DEBUG oslo.service.loopingcall [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1228.334727] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6397ff19-1385-4e38-b199-666394582582] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1228.334909] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b29a21d5-55e2-4f02-86ee-91f300a94e3a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.354721] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1228.354721] env[60764]: value = "task-2204967" [ 1228.354721] env[60764]: _type = "Task" [ 1228.354721] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1228.361927] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204967, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1228.554951] env[60764]: DEBUG nova.network.neutron [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Updated VIF entry in instance network info cache for port b8806ba7-4db6-4687-ae1a-218d91385527. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1228.555330] env[60764]: DEBUG nova.network.neutron [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] [instance: 6397ff19-1385-4e38-b199-666394582582] Updating instance_info_cache with network_info: [{"id": "b8806ba7-4db6-4687-ae1a-218d91385527", "address": "fa:16:3e:24:5a:87", "network": {"id": "61dbfc9b-6c39-478d-8814-6244010c9481", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1848041264-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8ca812a12f2549ab981fe8c2a003d01e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8806ba7-4d", "ovs_interfaceid": "b8806ba7-4db6-4687-ae1a-218d91385527", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1228.565455] env[60764]: DEBUG oslo_concurrency.lockutils [req-55be88fd-a814-46e2-9ca9-35700401e2ac req-2ceaa4df-92f7-44eb-96f7-0bf20f3b5877 service nova] Releasing lock "refresh_cache-6397ff19-1385-4e38-b199-666394582582" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1228.865020] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204967, 'name': CreateVM_Task, 'duration_secs': 0.315918} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1228.865020] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6397ff19-1385-4e38-b199-666394582582] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1228.865275] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1228.865275] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1228.865566] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1228.865813] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da7bdb6b-baf4-4d07-8222-8f36b95969f1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.870196] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Waiting for the task: (returnval){ [ 1228.870196] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52704a91-1188-85ad-eddc-9a4f930e7905" [ 1228.870196] env[60764]: _type = "Task" [ 1228.870196] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1228.877845] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52704a91-1188-85ad-eddc-9a4f930e7905, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1229.381054] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1229.381337] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1229.381537] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1243.479524] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "bf522599-8aa5-411a-96dd-8bd8328d9156" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1243.479852] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1275.094242] env[60764]: WARNING oslo_vmware.rw_handles [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1275.094242] env[60764]: ERROR oslo_vmware.rw_handles [ 1275.094242] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1275.094242] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1275.094242] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Copying Virtual Disk [datastore2] vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/29efc658-4717-4bef-9943-960e5220b3fe/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1275.094242] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a91579dd-98f9-4c93-b2f3-8632dc909b92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1275.094242] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Waiting for the task: (returnval){ [ 1275.094242] env[60764]: value = "task-2204968" [ 1275.094242] env[60764]: _type = "Task" [ 1275.094242] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1275.094242] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Task: {'id': task-2204968, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1275.591510] env[60764]: DEBUG oslo_vmware.exceptions [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1275.591799] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1275.592424] env[60764]: ERROR nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1275.592424] env[60764]: Faults: ['InvalidArgument'] [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Traceback (most recent call last): [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] yield resources [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self.driver.spawn(context, instance, image_meta, [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self._fetch_image_if_missing(context, vi) [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] image_cache(vi, tmp_image_ds_loc) [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] vm_util.copy_virtual_disk( [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] session._wait_for_task(vmdk_copy_task) [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] return self.wait_for_task(task_ref) [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] return evt.wait() [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] result = hub.switch() [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] return self.greenlet.switch() [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self.f(*self.args, **self.kw) [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] raise exceptions.translate_fault(task_info.error) [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Faults: ['InvalidArgument'] [ 1275.592424] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] [ 1275.593447] env[60764]: INFO nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Terminating instance [ 1275.594413] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1275.594573] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1275.594810] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-e17e904e-218e-48b2-b6ba-d6e5f5ea5296 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1275.597035] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1275.597232] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1275.597945] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6e6c516-7c9d-489a-b5bf-13279b94a42e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1275.604306] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1275.604536] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b312b9bd-9a78-4b3a-8353-02a164bcc970 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1275.606607] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1275.606779] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1275.607734] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-795ff85d-9491-4d2a-8643-cff2e94a478b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1275.612484] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Waiting for the task: (returnval){ [ 1275.612484] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]520dc8de-c94b-2efa-479f-5d7a28899297" [ 1275.612484] env[60764]: _type = "Task" [ 1275.612484] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1275.619445] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]520dc8de-c94b-2efa-479f-5d7a28899297, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1275.676690] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1275.676973] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1275.677182] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Deleting the datastore file [datastore2] 6af62a04-4a13-46c7-a0b2-28768c789f23 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1275.677429] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8bc6381e-5892-4fdc-90a1-0e057624a39f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1275.683328] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Waiting for the task: (returnval){ [ 1275.683328] env[60764]: value = "task-2204970" [ 1275.683328] env[60764]: _type = "Task" [ 1275.683328] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1275.690786] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Task: {'id': task-2204970, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1276.122752] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1276.123035] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Creating directory with path [datastore2] vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1276.123282] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5c8c011b-1285-40d1-ac25-9784ee6820a2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.135102] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Created directory with path [datastore2] vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1276.135326] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Fetch image to [datastore2] vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1276.135494] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1276.136294] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a245bb6-5e28-48da-9aa3-40bd918eaaa1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.142918] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-420e5a51-bb18-444a-85c9-66f68ff58bce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.153175] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bc201a9-dc5f-461a-b4f7-e8743a08f673 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.183583] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a1b5caf-162d-4199-af1f-16c6d3575b43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.194891] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-43d0e3b6-5e04-497c-af82-fdc1c9fc49d2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.196622] env[60764]: DEBUG oslo_vmware.api [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Task: {'id': task-2204970, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077507} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1276.196877] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1276.197071] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1276.197245] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1276.197411] env[60764]: INFO nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Took 0.60 seconds to destroy the instance on the hypervisor. 
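The failure above comes out of oslo.vmware's task polling: the CopyVirtualDisk_Task errors server-side with InvalidArgument ("A specified parameter was not correct: fileType"), and wait_for_task translates the task error into a VimFaultException that Nova's spawn path lets propagate, so the resource claim is aborted and the build rescheduled. A minimal sketch of that polling/translation contract follows; `session` (a connected oslo_vmware.api.VMwareAPISession), `copy_task` (a vim task reference) and the wrapper function name are illustrative placeholders, not code from this run.

    # Sketch only: session and copy_task are assumed placeholders, not names
    # taken from the log above.
    from oslo_vmware import exceptions as vexc

    def wait_for_copy(session, copy_task):
        try:
            # oslo.vmware polls the task server-side (the "progress is 0%"
            # lines above) and raises a translated fault if the task fails.
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as err:
            # In this run: err.fault_list == ['InvalidArgument'] with message
            # "A specified parameter was not correct: fileType". Nova lets the
            # exception propagate; the claim is aborted and the build
            # rescheduled, as the surrounding log entries show.
            if 'InvalidArgument' in err.fault_list:
                pass  # a caller could special-case this fault here
            raise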
[ 1276.199899] env[60764]: DEBUG nova.compute.claims [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1276.200089] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1276.200313] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1276.217165] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1276.267309] env[60764]: DEBUG oslo_vmware.rw_handles [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1276.330178] env[60764]: DEBUG oslo_vmware.rw_handles [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1276.330381] env[60764]: DEBUG oslo_vmware.rw_handles [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1276.549368] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5104b27-3a80-4100-b91f-b1edbff19773 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.557785] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83c75ff5-849a-4c54-9b65-445660db4c63 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.588344] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ad4961-3476-4ae2-8947-0f591ece9946 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.595207] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-730c0756-10aa-4af4-8b38-e908c5d274db {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1276.607914] env[60764]: DEBUG nova.compute.provider_tree [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1276.618013] env[60764]: DEBUG nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1276.634662] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.434s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1276.635199] env[60764]: ERROR nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1276.635199] env[60764]: Faults: ['InvalidArgument'] [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Traceback (most recent call last): [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1276.635199] 
env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self.driver.spawn(context, instance, image_meta, [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self._fetch_image_if_missing(context, vi) [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] image_cache(vi, tmp_image_ds_loc) [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] vm_util.copy_virtual_disk( [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] session._wait_for_task(vmdk_copy_task) [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] return self.wait_for_task(task_ref) [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] return evt.wait() [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] result = hub.switch() [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] return self.greenlet.switch() [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] self.f(*self.args, **self.kw) [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] raise exceptions.translate_fault(task_info.error) [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Faults: ['InvalidArgument'] [ 1276.635199] env[60764]: ERROR nova.compute.manager [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] [ 1276.635898] env[60764]: DEBUG nova.compute.utils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1276.637333] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Build of instance 6af62a04-4a13-46c7-a0b2-28768c789f23 was re-scheduled: A specified parameter was not correct: fileType [ 1276.637333] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1276.637748] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1276.637937] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1276.638124] env[60764]: DEBUG nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1276.638288] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1276.956876] env[60764]: DEBUG nova.network.neutron [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1276.973062] env[60764]: INFO nova.compute.manager [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Took 0.33 seconds to deallocate network for instance. [ 1277.075886] env[60764]: INFO nova.scheduler.client.report [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Deleted allocations for instance 6af62a04-4a13-46c7-a0b2-28768c789f23 [ 1277.098095] env[60764]: DEBUG oslo_concurrency.lockutils [None req-eefeb672-597d-41f8-977c-0ff42263e33e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 678.631s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1277.099398] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 481.310s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1277.099615] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "6af62a04-4a13-46c7-a0b2-28768c789f23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1277.099831] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1277.101137] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1277.103026] env[60764]: INFO nova.compute.manager [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Terminating instance [ 1277.104232] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquiring lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1277.104385] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Acquired lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1277.104640] env[60764]: DEBUG nova.network.neutron [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1277.113951] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1277.135175] env[60764]: DEBUG nova.network.neutron [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1277.177320] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1277.177320] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1277.178226] env[60764]: INFO nova.compute.claims [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1277.284580] env[60764]: DEBUG nova.network.neutron [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1277.295087] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Releasing lock "refresh_cache-6af62a04-4a13-46c7-a0b2-28768c789f23" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1277.295481] env[60764]: DEBUG nova.compute.manager [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1277.295696] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1277.296236] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-44e99a00-e0cc-485d-bc03-1de24f872ca4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.305936] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dabc134d-6e2a-4c91-8ed5-cfbdca2fd2c5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.337192] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1277.337371] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1277.337496] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1277.339152] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6af62a04-4a13-46c7-a0b2-28768c789f23 could not be found. [ 1277.339352] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1277.339523] env[60764]: INFO nova.compute.manager [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1277.339756] env[60764]: DEBUG oslo.service.loopingcall [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1277.343423] env[60764]: DEBUG nova.compute.manager [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1277.343423] env[60764]: DEBUG nova.network.neutron [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1277.362737] env[60764]: DEBUG nova.network.neutron [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1277.366152] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.366307] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.366439] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.366564] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.366710] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.366835] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.366952] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.367084] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.367203] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.367316] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1277.367434] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1277.372165] env[60764]: DEBUG nova.network.neutron [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1277.380375] env[60764]: INFO nova.compute.manager [-] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] Took 0.04 seconds to deallocate network for instance. [ 1277.477645] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f58c75fd-a38f-4ab0-8e9a-5760f17f215e tempest-ServersTestFqdnHostnames-179232180 tempest-ServersTestFqdnHostnames-179232180-project-member] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.378s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1277.478635] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 181.102s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1277.478690] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6af62a04-4a13-46c7-a0b2-28768c789f23] During sync_power_state the instance has a pending task (deleting). Skip. 
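The "Acquiring lock ... by ..." / "acquired ... :: waited" / "released ... :: held" triples that recur above (around instance_claim, do_terminate_instance and _sync_power_states) are emitted by oslo.concurrency's synchronized decorator, while the plain Acquiring/Acquired/Releasing lines around "refresh_cache-<uuid>" come from its lock() context manager. A minimal sketch of both patterns, assuming only the public lockutils API; the function names and bodies below are hypothetical stand-ins, not Nova's actual code:

```python
# Sketch of the oslo.concurrency usage behind the lock DEBUG lines above.
# Only the public lockutils API is assumed; claim_resources()/refresh_cache()
# are illustrative stand-ins, not Nova's ResourceTracker or network code.
from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def claim_resources(instance_uuid, claims):
    # Decorator form: logs 'Acquiring lock "compute_resources" by "<func>"',
    # then how long the caller waited and how long the lock was held.
    claims.append(instance_uuid)
    return claims


def refresh_cache(instance_uuid):
    # Context-manager form: logs the plain Acquiring/Releasing lock lines,
    # like the "refresh_cache-<uuid>" sections in this log.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass  # rebuild the instance's network info cache here
```

These wait/held timings only show up because DEBUG logging is enabled for oslo_concurrency in this run.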
[ 1277.478854] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "6af62a04-4a13-46c7-a0b2-28768c789f23" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1277.540631] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45ea0a0d-e490-435d-83f8-6979b0457c7d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.547231] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b99f7d0-f082-41cb-b563-ad88109ef3cc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.581036] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac2fee65-9a09-42c5-8035-6ca5d52aef4c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.588349] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba5469fc-4f5f-44c4-b5a4-816b5d2613ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1277.601655] env[60764]: DEBUG nova.compute.provider_tree [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1277.611284] env[60764]: DEBUG nova.scheduler.client.report [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1277.627205] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.451s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1277.627749] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1277.661691] env[60764]: DEBUG nova.compute.utils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1277.662902] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1277.663087] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1277.673228] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1278.395701] env[60764]: DEBUG nova.policy [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8165c7e326c4016a42ba39f68abfce6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5ed1c9589f44a86909b417fac99dab5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1278.397638] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1278.398054] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1278.411050] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1278.411261] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1278.411419] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 
None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1278.411569] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1278.412836] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73d50e80-e9a4-4f0b-a907-2d3355fa5fe2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.421468] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de397f8-48ca-496a-a1c0-b97f796c397c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.428103] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1278.438169] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-940a0444-8987-4ba2-a8b0-c44e0179bed0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.445494] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b70678-c9dd-4a64-9164-b5c293564a5d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.474719] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181261MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1278.474909] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1278.475123] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1278.486713] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1278.486840] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1278.489021] env[60764]: DEBUG nova.virt.hardware [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1278.489589] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7b2bb87-d1c6-4ad7-836f-0c428b8252d5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.497814] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23be66a1-9071-4143-904f-e6d2bc2fe6d1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1278.561516] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.561675] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.561804] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.561929] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.562183] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.562386] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.562519] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.562667] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.562783] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.562906] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1278.577524] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.589651] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9debc548-4034-4ba7-93a4-915fc6ad3229 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.600109] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c8ddfd42-c132-48e3-bade-f103a1bdea07 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.609942] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8b2206e9-2b46-44f8-a756-80be988926a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.620407] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 94b4adb4-6119-489a-820e-701790136809 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.631107] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b0b63493-2864-4767-a20c-83db66f395c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.640965] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1e869e4-179c-4ea1-9a50-c560e9d2f78b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.651243] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c32c50ad-0818-478e-8cfa-c34902153a2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.663032] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.673573] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d616729-866d-4ebb-8bc7-cf2172b70382 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.684256] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.694038] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1278.694285] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1278.694428] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1278.790899] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Successfully created port: 8256eed9-136d-4044-b31a-7cb578a1f2b3 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1279.012416] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e52f1e27-f7ee-4ded-8ef9-c2ecb53587b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.020580] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29665ebe-3bf1-463b-9112-11f451b7669a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.049991] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3466f69e-adcb-46ae-a19c-f1c032278ad0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.056976] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea792985-5f0f-468f-961f-c969934c2e42 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.070583] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1279.081209] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1279.095704] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1279.095704] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.361037] env[60764]: DEBUG nova.compute.manager [req-75dad4af-e8e5-4be3-999f-0c21c108f02f req-47c65d1d-8ca2-4346-ace2-cf93a31c46f7 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Received event network-vif-plugged-8256eed9-136d-4044-b31a-7cb578a1f2b3 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1279.361037] env[60764]: DEBUG oslo_concurrency.lockutils [req-75dad4af-e8e5-4be3-999f-0c21c108f02f req-47c65d1d-8ca2-4346-ace2-cf93a31c46f7 service nova] Acquiring lock "a3199f59-f827-404e-8272-296129096180-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1279.361037] env[60764]: DEBUG oslo_concurrency.lockutils [req-75dad4af-e8e5-4be3-999f-0c21c108f02f req-47c65d1d-8ca2-4346-ace2-cf93a31c46f7 service nova] Lock "a3199f59-f827-404e-8272-296129096180-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1279.361037] env[60764]: DEBUG oslo_concurrency.lockutils [req-75dad4af-e8e5-4be3-999f-0c21c108f02f req-47c65d1d-8ca2-4346-ace2-cf93a31c46f7 service nova] Lock "a3199f59-f827-404e-8272-296129096180-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.361037] env[60764]: DEBUG nova.compute.manager [req-75dad4af-e8e5-4be3-999f-0c21c108f02f req-47c65d1d-8ca2-4346-ace2-cf93a31c46f7 service nova] [instance: a3199f59-f827-404e-8272-296129096180] No waiting events found dispatching network-vif-plugged-8256eed9-136d-4044-b31a-7cb578a1f2b3 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1279.361037] env[60764]: WARNING nova.compute.manager [req-75dad4af-e8e5-4be3-999f-0c21c108f02f req-47c65d1d-8ca2-4346-ace2-cf93a31c46f7 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Received unexpected event network-vif-plugged-8256eed9-136d-4044-b31a-7cb578a1f2b3 for instance with vm_state building and task_state spawning. 
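A short worked check of the inventory and "Final resource view" figures above: the inventory dict and the m1.nano flavor values (memory_mb=128, root_gb=1, vcpus=1) are copied from records in this log, while the capacity() helper is only an illustrative sketch of how placement derives schedulable capacity ((total - reserved) * allocation_ratio, with max_unit capping any single allocation), not placement source code:

```python
# Worked check of the numbers reported above for provider
# 67a94047-1c18-43e8-9b47-05a1d30bcca4; values copied from the log,
# helper name illustrative only.
def capacity(inv):
    # Placement-style capacity: (total - reserved) * allocation_ratio;
    # max_unit additionally bounds any single allocation.
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 176},
}
for rc, inv in inventory.items():
    print(rc, capacity(inv))   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

# Resource tracker totals for the ten m1.nano instances
# (memory_mb=128, root_gb=1, vcpus=1) plus the 512 MB host reservation:
used_ram_mb = 512 + 10 * 128   # 1792  -> "used_ram=1792MB"
used_disk_gb = 10 * 1          # 10    -> "used_disk=10GB"
used_vcpus = 10 * 1            # 10    -> "used_vcpus=10"
```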
[ 1279.452073] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Successfully updated port: 8256eed9-136d-4044-b31a-7cb578a1f2b3 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1279.466763] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "refresh_cache-a3199f59-f827-404e-8272-296129096180" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1279.466763] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "refresh_cache-a3199f59-f827-404e-8272-296129096180" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1279.466763] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1279.527392] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1279.660460] env[60764]: DEBUG oslo_concurrency.lockutils [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "6397ff19-1385-4e38-b199-666394582582" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1279.795604] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Updating instance_info_cache with network_info: [{"id": "8256eed9-136d-4044-b31a-7cb578a1f2b3", "address": "fa:16:3e:9d:ba:7a", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8256eed9-13", "ovs_interfaceid": 
"8256eed9-136d-4044-b31a-7cb578a1f2b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1279.815925] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "refresh_cache-a3199f59-f827-404e-8272-296129096180" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1279.816282] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance network_info: |[{"id": "8256eed9-136d-4044-b31a-7cb578a1f2b3", "address": "fa:16:3e:9d:ba:7a", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8256eed9-13", "ovs_interfaceid": "8256eed9-136d-4044-b31a-7cb578a1f2b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1279.816688] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9d:ba:7a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '32463b6d-4569-4755-8a29-873a028690a7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8256eed9-136d-4044-b31a-7cb578a1f2b3', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1279.824323] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating folder: Project (b5ed1c9589f44a86909b417fac99dab5). Parent ref: group-v449629. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1279.824891] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-49f4b583-7f5a-4181-898e-7b5682bd95b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.836663] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created folder: Project (b5ed1c9589f44a86909b417fac99dab5) in parent group-v449629. [ 1279.836886] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating folder: Instances. Parent ref: group-v449710. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1279.837213] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0ed875e-ec7f-434b-94f9-0214ebe832f2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.846501] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created folder: Instances in parent group-v449710. [ 1279.846680] env[60764]: DEBUG oslo.service.loopingcall [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1279.846923] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a3199f59-f827-404e-8272-296129096180] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1279.847175] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f97f9085-72b0-41a4-b799-c0ada326a8e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.866594] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1279.866594] env[60764]: value = "task-2204973" [ 1279.866594] env[60764]: _type = "Task" [ 1279.866594] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1279.873772] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204973, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1280.377419] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204973, 'name': CreateVM_Task, 'duration_secs': 0.298861} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1280.378249] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a3199f59-f827-404e-8272-296129096180] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1280.378385] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1280.378444] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1280.378751] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1280.379009] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-48c1aeb3-9028-495a-b422-4cd0afa0ae8f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1280.383349] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 1280.383349] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]527c7a33-b3c6-b859-9614-0297b3151046" [ 1280.383349] env[60764]: _type = "Task" [ 1280.383349] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1280.390967] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]527c7a33-b3c6-b859-9614-0297b3151046, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1280.893644] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1280.893972] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1280.894146] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1281.027977] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1281.331165] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1281.332348] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1281.399215] env[60764]: DEBUG nova.compute.manager [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Received event network-changed-8256eed9-136d-4044-b31a-7cb578a1f2b3 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1281.399215] env[60764]: DEBUG nova.compute.manager [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Refreshing instance network info cache due to event network-changed-8256eed9-136d-4044-b31a-7cb578a1f2b3. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1281.399215] env[60764]: DEBUG oslo_concurrency.lockutils [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] Acquiring lock "refresh_cache-a3199f59-f827-404e-8272-296129096180" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1281.399809] env[60764]: DEBUG oslo_concurrency.lockutils [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] Acquired lock "refresh_cache-a3199f59-f827-404e-8272-296129096180" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1281.400164] env[60764]: DEBUG nova.network.neutron [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Refreshing network info cache for port 8256eed9-136d-4044-b31a-7cb578a1f2b3 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1281.926049] env[60764]: DEBUG nova.network.neutron [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Updated VIF entry in instance network info cache for port 8256eed9-136d-4044-b31a-7cb578a1f2b3. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1281.926415] env[60764]: DEBUG nova.network.neutron [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] [instance: a3199f59-f827-404e-8272-296129096180] Updating instance_info_cache with network_info: [{"id": "8256eed9-136d-4044-b31a-7cb578a1f2b3", "address": "fa:16:3e:9d:ba:7a", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8256eed9-13", "ovs_interfaceid": "8256eed9-136d-4044-b31a-7cb578a1f2b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1281.936659] env[60764]: DEBUG oslo_concurrency.lockutils [req-40f4e80a-1a18-405e-892f-86ba3973679e req-71333062-d415-4e3e-8f4b-051345cb9cc5 service nova] Releasing lock "refresh_cache-a3199f59-f827-404e-8272-296129096180" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1283.326278] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1283.329967] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1283.331018] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1285.200863] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1285.201169] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1285.329941] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1286.033457] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "a3199f59-f827-404e-8272-296129096180" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1325.087912] env[60764]: WARNING oslo_vmware.rw_handles [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1325.087912] env[60764]: ERROR 
oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1325.087912] env[60764]: ERROR oslo_vmware.rw_handles [ 1325.088679] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1325.091016] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1325.091285] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Copying Virtual Disk [datastore2] vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/9821b06a-08b5-4214-bad4-45fc5398689b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1325.091595] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a1458271-15bb-4e91-b924-14d37a284d2c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1325.099796] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Waiting for the task: (returnval){ [ 1325.099796] env[60764]: value = "task-2204974" [ 1325.099796] env[60764]: _type = "Task" [ 1325.099796] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1325.107867] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Task: {'id': task-2204974, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1325.609562] env[60764]: DEBUG oslo_vmware.exceptions [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1325.609852] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1325.610400] env[60764]: ERROR nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1325.610400] env[60764]: Faults: ['InvalidArgument'] [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] yield resources [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.driver.spawn(context, instance, image_meta, [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._fetch_image_if_missing(context, vi) [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] image_cache(vi, tmp_image_ds_loc) [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] vm_util.copy_virtual_disk( [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] session._wait_for_task(vmdk_copy_task) [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.wait_for_task(task_ref) [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return evt.wait() [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] result = hub.switch() [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.greenlet.switch() [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.f(*self.args, **self.kw) [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise exceptions.translate_fault(task_info.error) [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Faults: ['InvalidArgument'] [ 1325.610400] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1325.611245] env[60764]: INFO nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Terminating instance [ 1325.612249] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1325.612460] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1325.612695] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-4a97f053-2958-4e56-a5d1-c17b2781a5af {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1325.615073] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1325.615337] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1325.616074] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f36d451-abdb-41dd-b453-2b711d677730 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1325.622576] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1325.623510] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fa242c83-9ee0-4793-9de9-0f9c7bdd561b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1325.624929] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1325.625106] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1325.625792] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e9904b7-a9be-47c7-9b71-db28ddec7732 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1325.630916] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Waiting for the task: (returnval){ [ 1325.630916] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]526b71e7-9b31-d1e6-8abb-68a814c1e236" [ 1325.630916] env[60764]: _type = "Task" [ 1325.630916] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1325.637496] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]526b71e7-9b31-d1e6-8abb-68a814c1e236, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1325.700877] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1325.701154] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1325.701347] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Deleting the datastore file [datastore2] 7a843233-c56c-4d87-aeb0-2ffaa441b021 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1325.701621] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-481b6f9d-1d09-4c3c-a922-848ef52423cf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1325.708328] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Waiting for the task: (returnval){ [ 1325.708328] env[60764]: value = "task-2204976" [ 1325.708328] env[60764]: _type = "Task" [ 1325.708328] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1325.716720] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Task: {'id': task-2204976, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1326.141859] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1326.142152] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Creating directory with path [datastore2] vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1326.142392] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1db64c6a-6fa4-4ea2-ac45-eccae24baa9f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.153727] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Created directory with path [datastore2] vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1326.153915] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Fetch image to [datastore2] vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1326.154100] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1326.154825] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9291755-8e82-465e-8f21-24e1f576673a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.161401] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60cba56d-6ca8-4572-ac44-2595eec85bce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.170247] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c478d22-dd7f-4367-a2c0-a30f9e22495b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.201597] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3e2f76b9-0ff1-4824-b288-4074fb8bc3ba {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.207542] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e2ed45ea-d86a-4a1f-9bcd-ac7d49293cc5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.217423] env[60764]: DEBUG oslo_vmware.api [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Task: {'id': task-2204976, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066054} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1326.217667] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1326.217860] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1326.218023] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1326.218199] env[60764]: INFO nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Took 0.60 seconds to destroy the instance on the hypervisor. 
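
The spawn failure above follows oslo.vmware's invoke-then-wait cycle: a vSphere task such as CopyVirtualDisk_Task or DeleteDatastoreFile_Task is submitted, wait_for_task polls it ("progress is 0%"), and a failed task is raised back as a translated fault, here VimFaultException with ['InvalidArgument'] for fileType. The snippet below is a minimal sketch of that cycle outside Nova, not the nova.virt.vmwareapi code path itself; the vCenter host, credentials, datacenter moref and datastore paths are placeholders, and only the oslo.vmware entry points (VMwareAPISession, invoke_api, wait_for_task) are taken as given.

# Minimal sketch of the invoke-then-wait cycle visible in the log above.
# Host, credentials, moref and datastore paths are placeholders, not values
# from this log.
from oslo_vmware import api
from oslo_vmware import exceptions as vexc
from oslo_vmware import vim_util

session = api.VMwareAPISession(
    host='vcenter.example.org',              # placeholder vCenter
    server_username='administrator@vsphere.local',
    server_password='secret',                # placeholder credentials
    api_retry_count=10,
    task_poll_interval=0.5)

disk_mgr = session.vim.service_content.virtualDiskManager
dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')  # placeholder moref

try:
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/tmp-sparse.vmdk',
        sourceDatacenter=dc_ref,
        destName='[datastore2] devstack-image-cache_base/image.vmdk',
        destDatacenter=dc_ref)
    # wait_for_task polls the task (the "progress is 0%" lines above) and
    # raises a translated exception if the task ends in error.
    session.wait_for_task(task)
except vexc.VimFaultException as exc:
    # The failure path taken above: exc.fault_list == ['InvalidArgument'].
    print('copy failed: %s (faults: %s)' % (exc, exc.fault_list))
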
[ 1326.220444] env[60764]: DEBUG nova.compute.claims [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1326.220653] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1326.220905] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1326.232641] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1326.285541] env[60764]: DEBUG oslo_vmware.rw_handles [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1326.346213] env[60764]: DEBUG oslo_vmware.rw_handles [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1326.346463] env[60764]: DEBUG oslo_vmware.rw_handles [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1326.583091] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38a71b36-6d2f-4e90-b178-97d3542a86e0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.590895] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77969dd7-5234-4c9d-b076-78b846e7943c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.620961] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98636f03-0344-499a-8c3b-4d1429800d91 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.628044] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7fc0d78-6c73-4870-b10b-4052b85751bf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1326.640691] env[60764]: DEBUG nova.compute.provider_tree [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1326.649373] env[60764]: DEBUG nova.scheduler.client.report [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1326.662340] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.441s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1326.662860] env[60764]: ERROR nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1326.662860] env[60764]: Faults: ['InvalidArgument'] [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2615, 
in _build_and_run_instance [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.driver.spawn(context, instance, image_meta, [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._fetch_image_if_missing(context, vi) [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] image_cache(vi, tmp_image_ds_loc) [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] vm_util.copy_virtual_disk( [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] session._wait_for_task(vmdk_copy_task) [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.wait_for_task(task_ref) [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return evt.wait() [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] result = hub.switch() [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.greenlet.switch() [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.f(*self.args, **self.kw) [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 
7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise exceptions.translate_fault(task_info.error) [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Faults: ['InvalidArgument'] [ 1326.662860] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.663717] env[60764]: DEBUG nova.compute.utils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1326.665084] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Build of instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 was re-scheduled: A specified parameter was not correct: fileType [ 1326.665084] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1326.665487] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1326.665662] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1326.665813] env[60764]: DEBUG nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1326.665973] env[60764]: DEBUG nova.network.neutron [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1326.971759] env[60764]: DEBUG neutronclient.v2_0.client [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60764) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1326.973842] env[60764]: ERROR nova.compute.manager [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.driver.spawn(context, instance, image_meta, [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._fetch_image_if_missing(context, vi) [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] image_cache(vi, tmp_image_ds_loc) [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] vm_util.copy_virtual_disk( [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] session._wait_for_task(vmdk_copy_task) [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.wait_for_task(task_ref) [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return evt.wait() [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] result = hub.switch() [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.greenlet.switch() [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.f(*self.args, **self.kw) [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise exceptions.translate_fault(task_info.error) [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Faults: ['InvalidArgument'] [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] During handling of the above exception, another exception occurred: [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._build_and_run_instance(context, instance, image, [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File 
"/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise exception.RescheduledException( [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] nova.exception.RescheduledException: Build of instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 was re-scheduled: A specified parameter was not correct: fileType [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Faults: ['InvalidArgument'] [ 1326.973842] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] During handling of the above exception, another exception occurred: [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] exception_handler_v20(status_code, error_body) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise client_exc(message=error_message, [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Neutron server returns request_ids: ['req-3b59dcfc-e1dd-4a54-8144-a6a2285f2082'] [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] During handling of the above exception, another exception occurred: [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._deallocate_network(context, instance, requested_networks) [ 1326.974939] env[60764]: ERROR nova.compute.manager 
[instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.network_api.deallocate_for_instance( [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] data = neutron.list_ports(**search_opts) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.list('ports', self.ports_path, retrieve_all, [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] for r in self._pagination(collection, path, **params): [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] res = self.get(path, params=params) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.retry_request("GET", action, body=body, [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1326.974939] env[60764]: ERROR nova.compute.manager [instance: 
7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.do_request(method, action, body=body, [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._handle_fault_response(status_code, replybody, resp) [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise exception.Unauthorized() [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] nova.exception.Unauthorized: Not authorized. [ 1326.976068] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1327.028258] env[60764]: INFO nova.scheduler.client.report [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Deleted allocations for instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 [ 1327.047515] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e072e9e3-a287-40fb-8f71-97b688e33d2e tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 633.381s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1327.048633] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 437.571s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1327.048847] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Acquiring lock "7a843233-c56c-4d87-aeb0-2ffaa441b021-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1327.049062] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1327.049231] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1327.051125] env[60764]: INFO nova.compute.manager [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Terminating instance [ 1327.052834] env[60764]: DEBUG nova.compute.manager [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1327.053040] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1327.053495] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-586d654b-ac04-4a4c-9690-2150e15c0160 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.063906] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947ab0d9-097c-4d48-83a7-a17cd2e4f5b1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.074134] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1327.097030] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7a843233-c56c-4d87-aeb0-2ffaa441b021 could not be found. [ 1327.097030] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1327.097030] env[60764]: INFO nova.compute.manager [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Took 0.04 seconds to destroy the instance on the hypervisor. 
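
The lockutils lines above (the instance lock released after being held 633.381s and acquired by the terminate path after waiting 437.571s, plus the companion "-events" lock) come from oslo.concurrency's named in-process locks; the file/line references point at the synchronized() wrapper, which the compute manager keys on the instance UUID so that build, terminate and event handling serialize per instance. The sketch below only mirrors that locking pattern: the UUID is the one from the log, the function names and bodies are placeholders rather than the real compute-manager code.

# Minimal sketch of the per-instance named-lock pattern seen above.
from oslo_concurrency import lockutils

instance_uuid = '7a843233-c56c-4d87-aeb0-2ffaa441b021'  # UUID from the log


def terminate(uuid):
    # The lock name is only known at runtime, so the work is wrapped in a
    # locally defined function decorated with the UUID; synchronized() is
    # what emits the Acquiring / acquired :: waited / "released" :: held
    # DEBUG lines (lockutils.py "inner") shown in the log.
    @lockutils.synchronized(uuid)
    def do_terminate_instance():
        # A second, narrower lock guards per-instance event bookkeeping,
        # matching the "<uuid>-events" lock above.
        @lockutils.synchronized(uuid + '-events')
        def _clear_events():
            pass  # placeholder for the real work

        _clear_events()

    do_terminate_instance()


terminate(instance_uuid)
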
[ 1327.097030] env[60764]: DEBUG oslo.service.loopingcall [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1327.097030] env[60764]: DEBUG nova.compute.manager [-] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1327.097030] env[60764]: DEBUG nova.network.neutron [-] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1327.126931] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1327.126989] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1327.128529] env[60764]: INFO nova.compute.claims [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1327.232801] env[60764]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60764) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1327.233078] env[60764]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1327.233764] env[60764]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-26785e8b-118f-4e08-a6b2-3a693b38e0b4'] [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1327.233764] env[60764]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1327.233764] env[60764]: ERROR oslo.service.loopingcall [ 1327.236382] env[60764]: ERROR nova.compute.manager [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1327.267026] env[60764]: ERROR nova.compute.manager [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] exception_handler_v20(status_code, error_body) [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise client_exc(message=error_message, [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Neutron server returns request_ids: ['req-26785e8b-118f-4e08-a6b2-3a693b38e0b4'] [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] During handling of the above exception, another exception occurred: [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Traceback (most recent call last): [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._delete_instance(context, instance, bdms) [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._shutdown_instance(context, instance, bdms) [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._try_deallocate_network(context, instance, requested_networks) [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] with excutils.save_and_reraise_exception(): [ 1327.267026] env[60764]: ERROR 
nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.force_reraise() [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise self.value [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] _deallocate_network_with_retries() [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return evt.wait() [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] result = hub.switch() [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.greenlet.switch() [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] result = func(*self.args, **self.kw) [ 1327.267026] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] result = f(*args, **kwargs) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._deallocate_network( [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self.network_api.deallocate_for_instance( [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 
7a843233-c56c-4d87-aeb0-2ffaa441b021] data = neutron.list_ports(**search_opts) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.list('ports', self.ports_path, retrieve_all, [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] for r in self._pagination(collection, path, **params): [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] res = self.get(path, params=params) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.retry_request("GET", action, body=body, [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] return self.do_request(method, action, body=body, [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] ret = obj(*args, **kwargs) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] self._handle_fault_response(status_code, replybody, resp) [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1327.268154] env[60764]: ERROR nova.compute.manager [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] [ 1327.297178] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.248s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1327.298300] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 230.922s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1327.298475] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] During sync_power_state the instance has a pending task (deleting). Skip. [ 1327.299127] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "7a843233-c56c-4d87-aeb0-2ffaa441b021" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1327.352152] env[60764]: INFO nova.compute.manager [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] [instance: 7a843233-c56c-4d87-aeb0-2ffaa441b021] Successfully reverted task state from None on failure for instance. [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server [None req-f0a8efb8-2afb-438c-9c5c-14ba336a4de4 tempest-ServerDiagnosticsNegativeTest-1512875039 tempest-ServerDiagnosticsNegativeTest-1512875039-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-26785e8b-118f-4e08-a6b2-3a693b38e0b4'] [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1327.356488] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.357985] env[60764]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1327.357985] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1327.359571] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1327.359571] env[60764]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1327.359571] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1327.359571] env[60764]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1327.359571] env[60764]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1327.359571] env[60764]: ERROR oslo_messaging.rpc.server [ 1327.447509] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e30664f-8645-4cbe-81f5-44e9e15e3173 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.454598] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ded0a8b4-3d32-4c2b-b457-833511e7ca9e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.485958] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f42bee9e-2d2e-4ebc-b23d-98ecc040a5d0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.493710] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa698590-f0ce-467c-80a0-8200ddaa240e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.509317] env[60764]: DEBUG nova.compute.provider_tree [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1327.517654] env[60764]: DEBUG nova.scheduler.client.report [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1327.532069] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.405s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1327.532563] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1327.566814] env[60764]: DEBUG nova.compute.utils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1327.568353] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1327.568532] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1327.596193] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1327.633087] env[60764]: DEBUG nova.policy [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72051f4e68b049719e6faf2a31a92561', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8cd94049fe334cddb1283a0046e9ae48', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1327.667010] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1327.692067] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=<?>,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T11:22:42Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1327.692318] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1327.692477] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1327.692654] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1327.692800] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1327.692946] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1327.693168] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1327.693326] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} 
[ 1327.693496] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1327.693655] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1327.693823] env[60764]: DEBUG nova.virt.hardware [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1327.694692] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f551866-0f32-4f8d-99f7-49ae73646c90 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.702708] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e782de3c-3bd0-4306-864b-5a0a3b040567 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1327.966839] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Successfully created port: 5f4dbdce-acfb-4570-8976-322bc69bfa8d {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1328.617526] env[60764]: DEBUG nova.compute.manager [req-1f60d1d6-c8ef-41a6-bb01-c7e55ed47494 req-dfcd07e9-25d0-4359-8d83-1609094b65c3 service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Received event network-vif-plugged-5f4dbdce-acfb-4570-8976-322bc69bfa8d {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1328.617799] env[60764]: DEBUG oslo_concurrency.lockutils [req-1f60d1d6-c8ef-41a6-bb01-c7e55ed47494 req-dfcd07e9-25d0-4359-8d83-1609094b65c3 service nova] Acquiring lock "c645f7f5-528b-4719-96dd-8e50a46b4261-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1328.617963] env[60764]: DEBUG oslo_concurrency.lockutils [req-1f60d1d6-c8ef-41a6-bb01-c7e55ed47494 req-dfcd07e9-25d0-4359-8d83-1609094b65c3 service nova] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1328.618208] env[60764]: DEBUG oslo_concurrency.lockutils [req-1f60d1d6-c8ef-41a6-bb01-c7e55ed47494 req-dfcd07e9-25d0-4359-8d83-1609094b65c3 service nova] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1328.618300] env[60764]: DEBUG nova.compute.manager [req-1f60d1d6-c8ef-41a6-bb01-c7e55ed47494 req-dfcd07e9-25d0-4359-8d83-1609094b65c3 service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] No waiting events found dispatching network-vif-plugged-5f4dbdce-acfb-4570-8976-322bc69bfa8d {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1328.618445] env[60764]: WARNING nova.compute.manager [req-1f60d1d6-c8ef-41a6-bb01-c7e55ed47494 req-dfcd07e9-25d0-4359-8d83-1609094b65c3 service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Received unexpected event network-vif-plugged-5f4dbdce-acfb-4570-8976-322bc69bfa8d for instance with vm_state building and task_state spawning. [ 1328.704373] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Successfully updated port: 5f4dbdce-acfb-4570-8976-322bc69bfa8d {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1328.720301] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "refresh_cache-c645f7f5-528b-4719-96dd-8e50a46b4261" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1328.720972] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired lock "refresh_cache-c645f7f5-528b-4719-96dd-8e50a46b4261" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1328.720972] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1328.755755] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1328.925915] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Updating instance_info_cache with network_info: [{"id": "5f4dbdce-acfb-4570-8976-322bc69bfa8d", "address": "fa:16:3e:bf:0c:61", "network": {"id": "90189b06-6aae-49a6-aa89-ec0c32b73181", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1482896272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8cd94049fe334cddb1283a0046e9ae48", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f4dbdce-ac", "ovs_interfaceid": "5f4dbdce-acfb-4570-8976-322bc69bfa8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1328.936929] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Releasing lock "refresh_cache-c645f7f5-528b-4719-96dd-8e50a46b4261" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1328.937222] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance network_info: |[{"id": "5f4dbdce-acfb-4570-8976-322bc69bfa8d", "address": "fa:16:3e:bf:0c:61", "network": {"id": "90189b06-6aae-49a6-aa89-ec0c32b73181", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1482896272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8cd94049fe334cddb1283a0046e9ae48", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f4dbdce-ac", "ovs_interfaceid": "5f4dbdce-acfb-4570-8976-322bc69bfa8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1328.937735] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bf:0c:61', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46785c9c-8b22-487d-a854-b3e67c5ed1d7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5f4dbdce-acfb-4570-8976-322bc69bfa8d', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1328.945217] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating folder: Project (8cd94049fe334cddb1283a0046e9ae48). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1328.945708] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d95e6faf-eedb-4641-85b5-265bf51dda95 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.956350] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Created folder: Project (8cd94049fe334cddb1283a0046e9ae48) in parent group-v449629. [ 1328.956545] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating folder: Instances. Parent ref: group-v449713. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1328.956762] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f6994bec-1803-4046-b100-6a61c624652b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.965194] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Created folder: Instances in parent group-v449713. [ 1328.965409] env[60764]: DEBUG oslo.service.loopingcall [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1328.965584] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1328.965765] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-620ca4df-a649-442d-bf03-a872ba443d6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.983484] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1328.983484] env[60764]: value = "task-2204979" [ 1328.983484] env[60764]: _type = "Task" [ 1328.983484] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1328.990312] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204979, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1329.493257] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204979, 'name': CreateVM_Task, 'duration_secs': 0.299807} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1329.493443] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1329.494136] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1329.494300] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1329.494636] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1329.494884] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9c6b854b-13b0-4d49-b7fd-94f968b9557d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.499341] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 1329.499341] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ca9e2d-7954-08ca-33d5-46dfe054030e" [ 1329.499341] env[60764]: _type = "Task" [ 1329.499341] env[60764]: } to 
complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1329.506826] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ca9e2d-7954-08ca-33d5-46dfe054030e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1330.009444] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1330.009755] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1330.009960] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1330.645106] env[60764]: DEBUG nova.compute.manager [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Received event network-changed-5f4dbdce-acfb-4570-8976-322bc69bfa8d {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1330.645332] env[60764]: DEBUG nova.compute.manager [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Refreshing instance network info cache due to event network-changed-5f4dbdce-acfb-4570-8976-322bc69bfa8d. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1330.645534] env[60764]: DEBUG oslo_concurrency.lockutils [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] Acquiring lock "refresh_cache-c645f7f5-528b-4719-96dd-8e50a46b4261" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1330.645675] env[60764]: DEBUG oslo_concurrency.lockutils [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] Acquired lock "refresh_cache-c645f7f5-528b-4719-96dd-8e50a46b4261" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1330.645831] env[60764]: DEBUG nova.network.neutron [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Refreshing network info cache for port 5f4dbdce-acfb-4570-8976-322bc69bfa8d {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1330.954175] env[60764]: DEBUG nova.network.neutron [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Updated VIF entry in instance network info cache for port 5f4dbdce-acfb-4570-8976-322bc69bfa8d. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1330.954588] env[60764]: DEBUG nova.network.neutron [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Updating instance_info_cache with network_info: [{"id": "5f4dbdce-acfb-4570-8976-322bc69bfa8d", "address": "fa:16:3e:bf:0c:61", "network": {"id": "90189b06-6aae-49a6-aa89-ec0c32b73181", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1482896272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8cd94049fe334cddb1283a0046e9ae48", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f4dbdce-ac", "ovs_interfaceid": "5f4dbdce-acfb-4570-8976-322bc69bfa8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1330.964092] env[60764]: DEBUG oslo_concurrency.lockutils [req-42e915bc-4b2b-496e-8ada-9b370e2a87b0 req-6aaef7df-8164-42e2-99a8-86f0a3d3f0dc service nova] Releasing lock "refresh_cache-c645f7f5-528b-4719-96dd-8e50a46b4261" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1337.329992] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1337.330278] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1337.330353] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1337.355074] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.355312] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.355504] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.355689] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.355865] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.356062] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.356243] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.356410] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.356578] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.356743] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1337.356913] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1339.329494] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1339.345115] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1339.345355] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1339.345523] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1339.345690] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1339.346836] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49d9088a-8afd-4ebf-9684-e0459f83e4d5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.355527] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25458cc7-4b57-4a61-af74-9b2c89d3cfa3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.369162] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b8fa8a-9691-4c49-9ab0-1435e53535e7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.375270] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-047060fc-7e32-435e-b0dc-6c87e97d0a0b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.404662] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node 
resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181265MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1339.404823] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1339.405025] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1339.480236] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.480398] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.480528] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.480655] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.480776] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.480894] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.481019] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.481143] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.481260] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.481375] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1339.492146] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 9debc548-4034-4ba7-93a4-915fc6ad3229 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.502011] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c8ddfd42-c132-48e3-bade-f103a1bdea07 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.511522] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8b2206e9-2b46-44f8-a756-80be988926a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.521967] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 94b4adb4-6119-489a-820e-701790136809 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.530348] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b0b63493-2864-4767-a20c-83db66f395c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.541254] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1e869e4-179c-4ea1-9a50-c560e9d2f78b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.551436] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c32c50ad-0818-478e-8cfa-c34902153a2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.560395] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.569134] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 8d616729-866d-4ebb-8bc7-cf2172b70382 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.578194] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.586975] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.595672] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1339.595934] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1339.596113] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1339.827628] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a9b6825-1429-467a-be53-0d4ce71841ec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.834809] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9fcbac1-9317-467d-961f-4b0c729a2fc2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.865461] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfa7a885-74dc-42e6-bd9b-b7dcfe1791ec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.872211] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-896d21ab-3297-4991-b1d0-114b3707660e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1339.884625] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1339.893038] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1339.909063] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] 
Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1339.909271] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.504s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1340.910712] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.330397] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1342.330032] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1343.326048] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1343.346071] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1344.347471] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1345.329638] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1345.330058] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1346.338366] env[60764]: DEBUG oslo_concurrency.lockutils [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "c645f7f5-528b-4719-96dd-8e50a46b4261" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1347.331492] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1356.033208] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "a83f4609-4c09-4056-a840-cd899af93ea3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1356.033667] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "a83f4609-4c09-4056-a840-cd899af93ea3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1356.719139] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1356.719384] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1372.469416] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8024a74-db62-4fce-a55d-fe0906c49604 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "7d4b7608-622c-41c8-9532-e216ed41db91" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1372.469953] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8024a74-db62-4fce-a55d-fe0906c49604 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "7d4b7608-622c-41c8-9532-e216ed41db91" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1372.496268] env[60764]: DEBUG oslo_concurrency.lockutils [None req-628c6634-eeb2-45a5-b9a9-83dcc0c77749 tempest-ServersNegativeTestMultiTenantJSON-89438538 tempest-ServersNegativeTestMultiTenantJSON-89438538-project-member] Acquiring lock "6af78199-5a15-4ed8-94b3-abc98dbffe37" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1372.496513] env[60764]: DEBUG oslo_concurrency.lockutils [None req-628c6634-eeb2-45a5-b9a9-83dcc0c77749 tempest-ServersNegativeTestMultiTenantJSON-89438538 tempest-ServersNegativeTestMultiTenantJSON-89438538-project-member] Lock "6af78199-5a15-4ed8-94b3-abc98dbffe37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1374.132770] env[60764]: WARNING oslo_vmware.rw_handles [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1374.132770] env[60764]: ERROR oslo_vmware.rw_handles [ 1374.133468] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1374.134946] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1374.135307] env[60764]: DEBUG nova.virt.vmwareapi.vm_util 
[None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Copying Virtual Disk [datastore2] vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/6411abe2-4ea0-4bdd-a955-004fd495d44a/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1374.135604] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3a2ec43b-aa6f-4e51-a30b-40ccaf26d7f7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.143606] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Waiting for the task: (returnval){ [ 1374.143606] env[60764]: value = "task-2204980" [ 1374.143606] env[60764]: _type = "Task" [ 1374.143606] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1374.151642] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Task: {'id': task-2204980, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1374.654536] env[60764]: DEBUG oslo_vmware.exceptions [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1374.654880] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1374.655509] env[60764]: ERROR nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1374.655509] env[60764]: Faults: ['InvalidArgument'] [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Traceback (most recent call last): [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] yield resources [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self.driver.spawn(context, instance, image_meta, [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self._fetch_image_if_missing(context, vi) [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] image_cache(vi, tmp_image_ds_loc) [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] vm_util.copy_virtual_disk( [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] session._wait_for_task(vmdk_copy_task) [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] return self.wait_for_task(task_ref) [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] return evt.wait() [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] result = hub.switch() [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] return self.greenlet.switch() [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self.f(*self.args, **self.kw) [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] raise exceptions.translate_fault(task_info.error) [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Faults: ['InvalidArgument'] [ 1374.655509] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] [ 1374.656462] env[60764]: INFO nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Terminating instance [ 1374.657583] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1374.657829] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1374.658532] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 
tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1374.658761] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1374.659027] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-266c0ba4-df53-4267-86ff-ed6deb61b414 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.661567] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12459313-26a7-4cf6-9906-3a5cc9873442 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.669255] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1374.669436] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5891c856-ef25-4170-8a54-573425eca52e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.672020] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1374.672020] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1374.673012] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9a53035e-db44-42f9-9894-04d0e704617a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.677823] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 1374.677823] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5292aa25-5380-e833-9e69-c39e45e83745" [ 1374.677823] env[60764]: _type = "Task" [ 1374.677823] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1374.691729] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5292aa25-5380-e833-9e69-c39e45e83745, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1374.750535] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1374.750765] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1374.750939] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Deleting the datastore file [datastore2] 74b4bba7-8568-4fc4-a744-395a3271abc8 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1374.751242] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f6644750-0d02-4e38-b571-ea04017816b0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1374.757635] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Waiting for the task: (returnval){ [ 1374.757635] env[60764]: value = "task-2204982" [ 1374.757635] env[60764]: _type = "Task" [ 1374.757635] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1374.765654] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Task: {'id': task-2204982, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1375.188124] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1375.188489] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating directory with path [datastore2] vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1375.188489] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-258c0dc3-c8c0-4266-81c4-12c59be41b3e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.199740] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created directory with path [datastore2] vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1375.199943] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Fetch image to [datastore2] vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1375.200125] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1375.200835] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d784a14-159d-4efe-85af-0b55e482ec92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.207266] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-139b31ab-236a-48d4-a143-9575db552258 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.216278] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80d79cd6-b077-4a62-9c0c-69346600795d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.247384] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bb8af270-860b-4870-8a1b-0a60020769bf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.253044] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-24f2cedb-cd06-4914-b4e2-4e8b1813f394 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.265181] env[60764]: DEBUG oslo_vmware.api [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Task: {'id': task-2204982, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081547} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1375.265595] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1375.265595] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1375.265744] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1375.265903] env[60764]: INFO nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1375.268131] env[60764]: DEBUG nova.compute.claims [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1375.268301] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1375.268519] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1375.274234] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1375.323287] env[60764]: DEBUG oslo_vmware.rw_handles [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1375.382164] env[60764]: DEBUG oslo_vmware.rw_handles [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1375.382971] env[60764]: DEBUG oslo_vmware.rw_handles [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
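The write handle opened above points at the ESX host's datastore file endpoint. The snippet below only rebuilds that upload URL from its parts as they appear in the log; it is not oslo.vmware's rw_handles code.

from urllib.parse import urlencode

# All values below are taken from the log entry above.
host = "esx7c2n2.openstack.eu-de-1.cloud.sap"
path = ("vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/"
        "04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk")
query = urlencode({"dcPath": "ha-datacenter", "dsName": "datastore2"})
print(f"https://{host}:443/folder/{path}?{query}")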
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1375.580503] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b8f50f5-6c84-41d9-ad66-e6ed199ee92a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.588146] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0402e2ca-f1f5-43c1-958f-e0f075e7d504 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.617373] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-402b222e-31cd-42e5-b754-edd086ed5490 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.624377] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f2cc88a-5cd1-4687-bf6a-7f9e581c8622 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1375.638164] env[60764]: DEBUG nova.compute.provider_tree [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1375.646738] env[60764]: DEBUG nova.scheduler.client.report [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1375.660947] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.392s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1375.661489] env[60764]: ERROR nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1375.661489] env[60764]: Faults: ['InvalidArgument'] [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Traceback (most recent call last): [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1375.661489] 
env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self.driver.spawn(context, instance, image_meta, [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self._fetch_image_if_missing(context, vi) [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] image_cache(vi, tmp_image_ds_loc) [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] vm_util.copy_virtual_disk( [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] session._wait_for_task(vmdk_copy_task) [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] return self.wait_for_task(task_ref) [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] return evt.wait() [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] result = hub.switch() [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] return self.greenlet.switch() [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] self.f(*self.args, **self.kw) [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] raise exceptions.translate_fault(task_info.error) [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Faults: ['InvalidArgument'] [ 1375.661489] env[60764]: ERROR nova.compute.manager [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] [ 1375.662375] env[60764]: DEBUG nova.compute.utils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1375.663579] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Build of instance 74b4bba7-8568-4fc4-a744-395a3271abc8 was re-scheduled: A specified parameter was not correct: fileType [ 1375.663579] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1375.663944] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1375.664132] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1375.664304] env[60764]: DEBUG nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1375.664465] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1376.112930] env[60764]: DEBUG nova.network.neutron [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1376.129026] env[60764]: INFO nova.compute.manager [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Took 0.46 seconds to deallocate network for instance. [ 1376.219907] env[60764]: INFO nova.scheduler.client.report [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Deleted allocations for instance 74b4bba7-8568-4fc4-a744-395a3271abc8 [ 1376.238515] env[60764]: DEBUG oslo_concurrency.lockutils [None req-be0b232f-8d09-49af-ac30-b6af3b249483 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 626.475s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.239649] env[60764]: DEBUG oslo_concurrency.lockutils [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 429.978s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.239858] env[60764]: DEBUG oslo_concurrency.lockutils [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Acquiring lock "74b4bba7-8568-4fc4-a744-395a3271abc8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.240075] env[60764]: DEBUG oslo_concurrency.lockutils [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.240244] env[60764]: DEBUG oslo_concurrency.lockutils [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.242199] env[60764]: INFO nova.compute.manager [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Terminating instance [ 1376.243981] env[60764]: DEBUG nova.compute.manager [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1376.244198] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1376.244653] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-89557362-9ccd-4070-8f92-565c58e0309a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1376.253958] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6544531f-3dd2-424f-afa2-8b4af3ba4a59 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1376.265200] env[60764]: DEBUG nova.compute.manager [None req-08f96d72-175b-439b-9ccf-2ceeaa03052f tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: 9debc548-4034-4ba7-93a4-915fc6ad3229] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.285593] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 74b4bba7-8568-4fc4-a744-395a3271abc8 could not be found. 
[ 1376.285803] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1376.285995] env[60764]: INFO nova.compute.manager [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1376.286276] env[60764]: DEBUG oslo.service.loopingcall [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1376.286504] env[60764]: DEBUG nova.compute.manager [-] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1376.286602] env[60764]: DEBUG nova.network.neutron [-] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1376.289031] env[60764]: DEBUG nova.compute.manager [None req-08f96d72-175b-439b-9ccf-2ceeaa03052f tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: 9debc548-4034-4ba7-93a4-915fc6ad3229] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.310603] env[60764]: DEBUG oslo_concurrency.lockutils [None req-08f96d72-175b-439b-9ccf-2ceeaa03052f tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "9debc548-4034-4ba7-93a4-915fc6ad3229" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.555s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.312166] env[60764]: DEBUG nova.network.neutron [-] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1376.320197] env[60764]: INFO nova.compute.manager [-] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] Took 0.03 seconds to deallocate network for instance. [ 1376.322086] env[60764]: DEBUG nova.compute.manager [None req-28f9bae5-7746-4e94-a457-441894b0ca0d tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: c8ddfd42-c132-48e3-bade-f103a1bdea07] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.346760] env[60764]: DEBUG nova.compute.manager [None req-28f9bae5-7746-4e94-a457-441894b0ca0d tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: c8ddfd42-c132-48e3-bade-f103a1bdea07] Instance disappeared before build. 
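The failed build above ends in oslo_vmware.exceptions.VimFaultException with fault 'InvalidArgument' on the fileType parameter, which is what drives the re-schedule and the cleanup that follows. A minimal sketch of recognising that fault, assuming oslo.vmware's VimFaultException keeps the fault names in its fault_list attribute (the reschedule decision itself is nova's, not shown):

from oslo_vmware import exceptions as vexc

def handle_spawn_error(exc):
    # Assumption: VimFaultException exposes the VMware fault names via
    # fault_list, as the traceback above suggests.
    if isinstance(exc, vexc.VimFaultException) and "InvalidArgument" in exc.fault_list:
        return "reschedule"   # matches the re-scheduled build in the log
    return "reraise"

err = vexc.VimFaultException(["InvalidArgument"],
                             "A specified parameter was not correct: fileType")
print(handle_spawn_error(err))   # -> reschedule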
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.367330] env[60764]: DEBUG oslo_concurrency.lockutils [None req-28f9bae5-7746-4e94-a457-441894b0ca0d tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "c8ddfd42-c132-48e3-bade-f103a1bdea07" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.382s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.377056] env[60764]: DEBUG nova.compute.manager [None req-bc90d127-57da-484a-a5ec-aaece59bdaf5 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] [instance: 8b2206e9-2b46-44f8-a756-80be988926a4] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.426818] env[60764]: DEBUG nova.compute.manager [None req-bc90d127-57da-484a-a5ec-aaece59bdaf5 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] [instance: 8b2206e9-2b46-44f8-a756-80be988926a4] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.447167] env[60764]: DEBUG oslo_concurrency.lockutils [None req-021b55b2-a351-4bb3-9bd7-2e22c3e76bd8 tempest-ServerActionsTestOtherA-1158385664 tempest-ServerActionsTestOtherA-1158385664-project-member] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.207s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.447656] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 280.071s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.448282] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 74b4bba7-8568-4fc4-a744-395a3271abc8] During sync_power_state the instance has a pending task (deleting). Skip. [ 1376.448282] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "74b4bba7-8568-4fc4-a744-395a3271abc8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.452681] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bc90d127-57da-484a-a5ec-aaece59bdaf5 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] Lock "8b2206e9-2b46-44f8-a756-80be988926a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.257s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.463291] env[60764]: DEBUG nova.compute.manager [None req-38395585-9e79-4c53-b9ea-0234f09eb1b9 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] [instance: 94b4adb4-6119-489a-820e-701790136809] Starting instance... 
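The _sync_power_states entry above skips the instance because a delete task is still pending. A toy version of that check (names and structure are illustrative only, not nova's code):

def should_sync_power_state(instance):
    # Instances with an in-flight task are left alone by the periodic
    # power-state sync, as in the "pending task (deleting). Skip." entry.
    if instance.get("task_state") is not None:
        print(f"instance {instance['uuid']}: pending task "
              f"({instance['task_state']}). Skip.")
        return False
    return True

print(should_sync_power_state(
    {"uuid": "74b4bba7-8568-4fc4-a744-395a3271abc8",
     "task_state": "deleting"}))   # -> False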
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.490988] env[60764]: DEBUG nova.compute.manager [None req-38395585-9e79-4c53-b9ea-0234f09eb1b9 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] [instance: 94b4adb4-6119-489a-820e-701790136809] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.515590] env[60764]: DEBUG oslo_concurrency.lockutils [None req-38395585-9e79-4c53-b9ea-0234f09eb1b9 tempest-ServersAdminTestJSON-275582914 tempest-ServersAdminTestJSON-275582914-project-member] Lock "94b4adb4-6119-489a-820e-701790136809" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.878s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.524478] env[60764]: DEBUG nova.compute.manager [None req-88ac6484-7180-4d99-85f1-68409d37ad73 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] [instance: b0b63493-2864-4767-a20c-83db66f395c6] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.547713] env[60764]: DEBUG nova.compute.manager [None req-88ac6484-7180-4d99-85f1-68409d37ad73 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] [instance: b0b63493-2864-4767-a20c-83db66f395c6] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.588797] env[60764]: DEBUG oslo_concurrency.lockutils [None req-88ac6484-7180-4d99-85f1-68409d37ad73 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] Lock "b0b63493-2864-4767-a20c-83db66f395c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.057s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.616962] env[60764]: DEBUG nova.compute.manager [None req-3d8629e2-1b5f-482f-8979-ff064b10ddb3 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] [instance: f1e869e4-179c-4ea1-9a50-c560e9d2f78b] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.641850] env[60764]: DEBUG nova.compute.manager [None req-3d8629e2-1b5f-482f-8979-ff064b10ddb3 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] [instance: f1e869e4-179c-4ea1-9a50-c560e9d2f78b] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.666635] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3d8629e2-1b5f-482f-8979-ff064b10ddb3 tempest-ListImageFiltersTestJSON-1608500095 tempest-ListImageFiltersTestJSON-1608500095-project-member] Lock "f1e869e4-179c-4ea1-9a50-c560e9d2f78b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.811s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.677567] env[60764]: DEBUG nova.compute.manager [None req-cbfc7571-c035-4164-bcaf-251391837c92 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: c32c50ad-0818-478e-8cfa-c34902153a2c] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.703409] env[60764]: DEBUG nova.compute.manager [None req-cbfc7571-c035-4164-bcaf-251391837c92 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: c32c50ad-0818-478e-8cfa-c34902153a2c] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1376.725068] env[60764]: DEBUG oslo_concurrency.lockutils [None req-cbfc7571-c035-4164-bcaf-251391837c92 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "c32c50ad-0818-478e-8cfa-c34902153a2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.567s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1376.734735] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Starting instance... 
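The long "waited"/"held" figures reported above come from the oslo.concurrency lock wrappers that serialize build, terminate and sync operations on each instance UUID. A minimal sketch of the same decorator pattern, assuming only that oslo.concurrency is installed (the function body is hypothetical; just the lock-name style mirrors the log):

from oslo_concurrency import lockutils

@lockutils.synchronized("f1e869e4-179c-4ea1-9a50-c560e9d2f78b")
def do_build_sketch():
    # Whatever runs here holds the per-instance lock; a competing
    # terminate request blocks and later logs how long it waited.
    return "built"

print(do_build_sketch())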
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1376.783164] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.783434] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1376.784876] env[60764]: INFO nova.compute.claims [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1377.044917] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa08fbc9-a633-4492-9936-66fb3a788c6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1377.052779] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f16fee1-5807-47ee-ae11-fa3bc8a18e81 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1377.082273] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d42fe6ad-ec14-49ce-995a-f49241fab13a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1377.089243] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-556ea129-e42c-42a5-a0ae-235f63643fa6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1377.102014] env[60764]: DEBUG nova.compute.provider_tree [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1377.110349] env[60764]: DEBUG nova.scheduler.client.report [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1377.124080] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 
tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1377.136068] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "a3fb8e21-6a4f-451a-8455-0bd41b189648" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1377.136330] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "a3fb8e21-6a4f-451a-8455-0bd41b189648" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1377.142019] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "a3fb8e21-6a4f-451a-8455-0bd41b189648" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.005s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1377.142019] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1377.172052] env[60764]: DEBUG nova.compute.utils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1377.173064] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1377.173240] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1377.182661] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Start building block device mappings for instance. 
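The inventory blob logged for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 above uses the standard placement fields. Reading it as capacity = (total - reserved) * allocation_ratio, the usual way placement derives schedulable capacity, gives:

# Figures copied from the inventory entry above.
inventory = {
    "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: schedulable capacity = {capacity:g}")
# -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400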
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1377.246101] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1377.262526] env[60764]: DEBUG nova.policy [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cb349e2f49e54e23a10f24f423d16b97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e09a52004ad340b8b206679b26484646', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1377.272673] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1377.272673] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1377.272868] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1377.273040] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1377.273230] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1377.273383] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 
tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1377.273649] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1377.273732] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1377.274031] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1377.274148] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1377.274231] env[60764]: DEBUG nova.virt.hardware [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1377.275097] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7c865eb-e377-4377-97ce-5bc91a91bdc9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1377.283714] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05d5feb1-6d93-499c-93aa-2b0129ded27b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1377.562032] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Successfully created port: d5162840-b594-4001-aff2-baddd85ec5d4 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1378.446095] env[60764]: DEBUG nova.compute.manager [req-101f562f-03ad-441a-8f1a-462386e94cf7 req-3de3b923-a870-4f90-b920-1a4d6b343f96 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Received event network-vif-plugged-d5162840-b594-4001-aff2-baddd85ec5d4 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1378.446095] env[60764]: DEBUG oslo_concurrency.lockutils [req-101f562f-03ad-441a-8f1a-462386e94cf7 req-3de3b923-a870-4f90-b920-1a4d6b343f96 service nova] Acquiring lock "51512549-4c6e-41d4-98b0-7d1e801a8b69-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1378.446095] env[60764]: DEBUG oslo_concurrency.lockutils [req-101f562f-03ad-441a-8f1a-462386e94cf7 req-3de3b923-a870-4f90-b920-1a4d6b343f96 service nova] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1378.446095] env[60764]: DEBUG oslo_concurrency.lockutils [req-101f562f-03ad-441a-8f1a-462386e94cf7 req-3de3b923-a870-4f90-b920-1a4d6b343f96 service nova] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1378.446095] env[60764]: DEBUG nova.compute.manager [req-101f562f-03ad-441a-8f1a-462386e94cf7 req-3de3b923-a870-4f90-b920-1a4d6b343f96 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] No waiting events found dispatching network-vif-plugged-d5162840-b594-4001-aff2-baddd85ec5d4 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1378.446095] env[60764]: WARNING nova.compute.manager [req-101f562f-03ad-441a-8f1a-462386e94cf7 req-3de3b923-a870-4f90-b920-1a4d6b343f96 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Received unexpected event network-vif-plugged-d5162840-b594-4001-aff2-baddd85ec5d4 for instance with vm_state building and task_state spawning. [ 1378.528903] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Successfully updated port: d5162840-b594-4001-aff2-baddd85ec5d4 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1378.544062] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "refresh_cache-51512549-4c6e-41d4-98b0-7d1e801a8b69" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1378.544337] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquired lock "refresh_cache-51512549-4c6e-41d4-98b0-7d1e801a8b69" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1378.544413] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1378.607586] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1378.830100] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Updating instance_info_cache with network_info: [{"id": "d5162840-b594-4001-aff2-baddd85ec5d4", "address": "fa:16:3e:87:34:b0", "network": {"id": "03f242d1-819c-4871-ad6b-d72f0a390694", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1282276229-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e09a52004ad340b8b206679b26484646", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a", "external-id": "nsx-vlan-transportzone-78", "segmentation_id": 78, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5162840-b5", "ovs_interfaceid": "d5162840-b594-4001-aff2-baddd85ec5d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1378.845173] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Releasing lock "refresh_cache-51512549-4c6e-41d4-98b0-7d1e801a8b69" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1378.845429] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance network_info: |[{"id": "d5162840-b594-4001-aff2-baddd85ec5d4", "address": "fa:16:3e:87:34:b0", "network": {"id": "03f242d1-819c-4871-ad6b-d72f0a390694", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1282276229-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e09a52004ad340b8b206679b26484646", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a", "external-id": "nsx-vlan-transportzone-78", "segmentation_id": 78, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5162840-b5", "ovs_interfaceid": "d5162840-b594-4001-aff2-baddd85ec5d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1378.845836] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:34:b0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '205fb402-8eaf-4b61-8f57-8f216024179a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd5162840-b594-4001-aff2-baddd85ec5d4', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1378.853759] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Creating folder: Project (e09a52004ad340b8b206679b26484646). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1378.854460] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-35a7d64a-59d7-4508-9dea-235e208cea38 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.865493] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Created folder: Project (e09a52004ad340b8b206679b26484646) in parent group-v449629. [ 1378.865493] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Creating folder: Instances. Parent ref: group-v449716. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1378.865493] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4cf6c8d8-5bef-4e0e-a78e-25089c16c186 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.873647] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Created folder: Instances in parent group-v449716. [ 1378.873896] env[60764]: DEBUG oslo.service.loopingcall [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1378.874100] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1378.874314] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-db538610-fc85-4e97-aaef-b466ed29906e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.906391] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1378.906391] env[60764]: value = "task-2204985" [ 1378.906391] env[60764]: _type = "Task" [ 1378.906391] env[60764]: } to complete. 
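The "Instance VIF info" entry above is derived from the network_info just cached for port d5162840-b594-4001-aff2-baddd85ec5d4. A reduced illustration of that mapping using the values from the log (the real translation lives in nova's vmwareapi VIF handling and covers more cases):

# Fields trimmed to the ones used here; values are from the cached
# network_info entry above.
vif = {
    "id": "d5162840-b594-4001-aff2-baddd85ec5d4",
    "address": "fa:16:3e:87:34:b0",
    "network": {"bridge": "br-int"},
    "details": {"nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a"},
}
vif_info = {
    "network_name": vif["network"]["bridge"],
    "mac_address": vif["address"],
    "network_ref": {"type": "OpaqueNetwork",
                    "network-id": vif["details"]["nsx-logical-switch-id"],
                    "network-type": "nsx.LogicalSwitch",
                    "use-external-id": True},
    "iface_id": vif["id"],
    "vif_model": "vmxnet3",
}
print(vif_info["network_ref"]["network-id"])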
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1378.918333] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204985, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1379.416906] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204985, 'name': CreateVM_Task, 'duration_secs': 0.3236} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1379.417186] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1379.418051] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1379.418263] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1379.418708] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1379.419011] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-36364460-d9a7-4a51-a15c-7f4d707520ea {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.424166] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Waiting for the task: (returnval){ [ 1379.424166] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52b1604c-05c9-d0c6-eee8-fe7b4c5c0ef8" [ 1379.424166] env[60764]: _type = "Task" [ 1379.424166] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1379.435443] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52b1604c-05c9-d0c6-eee8-fe7b4c5c0ef8, 'name': SearchDatastore_Task} progress is 0%. 
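The CreateVM_Task and SearchDatastore_Task entries above follow the usual submit-then-poll pattern ("progress is 0%", then "completed successfully"). The loop below is only the general shape of that pattern, not oslo.vmware's _poll_task:

import time

def wait_for_task_sketch(poll, interval=0.0):
    # poll() returns (state, progress) with state in
    # {"running", "success", "error"}.
    while True:
        state, progress = poll()
        if state == "success":
            return
        if state == "error":
            raise RuntimeError("task failed")
        print(f"progress is {progress}%")
        time.sleep(interval)

states = iter([("running", 0), ("running", 60), ("success", 100)])
wait_for_task_sketch(lambda: next(states))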
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1379.934806] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1379.935569] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1379.935900] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1380.477054] env[60764]: DEBUG nova.compute.manager [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Received event network-changed-d5162840-b594-4001-aff2-baddd85ec5d4 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1380.477272] env[60764]: DEBUG nova.compute.manager [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Refreshing instance network info cache due to event network-changed-d5162840-b594-4001-aff2-baddd85ec5d4. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1380.477489] env[60764]: DEBUG oslo_concurrency.lockutils [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] Acquiring lock "refresh_cache-51512549-4c6e-41d4-98b0-7d1e801a8b69" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1380.477630] env[60764]: DEBUG oslo_concurrency.lockutils [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] Acquired lock "refresh_cache-51512549-4c6e-41d4-98b0-7d1e801a8b69" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1380.477786] env[60764]: DEBUG nova.network.neutron [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Refreshing network info cache for port d5162840-b594-4001-aff2-baddd85ec5d4 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1380.800548] env[60764]: DEBUG nova.network.neutron [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Updated VIF entry in instance network info cache for port d5162840-b594-4001-aff2-baddd85ec5d4. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1380.801547] env[60764]: DEBUG nova.network.neutron [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Updating instance_info_cache with network_info: [{"id": "d5162840-b594-4001-aff2-baddd85ec5d4", "address": "fa:16:3e:87:34:b0", "network": {"id": "03f242d1-819c-4871-ad6b-d72f0a390694", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1282276229-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e09a52004ad340b8b206679b26484646", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "205fb402-8eaf-4b61-8f57-8f216024179a", "external-id": "nsx-vlan-transportzone-78", "segmentation_id": 78, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5162840-b5", "ovs_interfaceid": "d5162840-b594-4001-aff2-baddd85ec5d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1380.810125] env[60764]: DEBUG oslo_concurrency.lockutils [req-48fed652-5dbc-4762-a97d-230018462da7 req-8216a3c7-8516-489a-9da9-f907f2e1fb33 service nova] Releasing lock "refresh_cache-51512549-4c6e-41d4-98b0-7d1e801a8b69" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1384.999820] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1385.000530] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1385.295283] env[60764]: DEBUG oslo_concurrency.lockutils [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1396.933256] env[60764]: DEBUG oslo_concurrency.lockutils [None req-22d536c8-b277-4c82-8efc-8c24bb0bb856 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock 
"e7742bf9-cd57-4a84-853f-886e5bc5a6b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1396.933558] env[60764]: DEBUG oslo_concurrency.lockutils [None req-22d536c8-b277-4c82-8efc-8c24bb0bb856 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "e7742bf9-cd57-4a84-853f-886e5bc5a6b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1398.215864] env[60764]: DEBUG oslo_concurrency.lockutils [None req-61bb0ee9-be2b-4135-b445-5e61e2821973 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Acquiring lock "f13299b8-2c86-41a2-b14c-bfd68ab6dd22" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1398.216165] env[60764]: DEBUG oslo_concurrency.lockutils [None req-61bb0ee9-be2b-4135-b445-5e61e2821973 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Lock "f13299b8-2c86-41a2-b14c-bfd68ab6dd22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1398.330337] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1398.330517] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1398.330641] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1398.351581] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.351746] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.351886] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352032] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352170] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352293] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352415] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352535] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352657] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352773] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1398.352952] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1400.330205] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1400.330479] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1400.341360] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1400.341576] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1400.341748] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1400.341930] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1400.343092] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe604be-93d3-491d-86d2-694e5ae30374 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.351911] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd6c6f2d-bdd6-47a3-83d4-d2926d9a4285 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.365544] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1365390d-5de5-4bb5-86b8-2d8d4118430a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.371960] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df858cb-1d1d-4d78-aeaf-c4a571cc3b80 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.401603] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181219MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1400.401751] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1400.401934] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1400.544768] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance dfd3e3af-90c9-420b-81ec-e9115c519016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.544948] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545093] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545223] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545345] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545463] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545580] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545790] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.545941] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.546072] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1400.557225] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.567467] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.576640] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.585341] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.594023] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7d4b7608-622c-41c8-9532-e216ed41db91 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.602420] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af78199-5a15-4ed8-94b3-abc98dbffe37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.610862] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.620342] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e7742bf9-cd57-4a84-853f-886e5bc5a6b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.629164] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f13299b8-2c86-41a2-b14c-bfd68ab6dd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1400.629399] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1400.629581] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1400.647517] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing inventories for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1400.661518] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating ProviderTree inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1400.661712] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1400.672302] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing aggregate associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, aggregates: None {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1400.689118] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing trait associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1400.907603] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90242702-f883-4bb2-a17f-7e9cdf6be4e0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.915115] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-eb17bf5c-4819-4f13-883f-6f4e2ce4830b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.943505] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49ac6f9f-9337-4283-9850-527c179ccd65 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.950191] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f3f6bf8-3e7d-4209-9a1e-97ebf2a32e78 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1400.964213] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1400.972877] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1400.986977] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1400.987217] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1400.987434] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1400.987573] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances with incomplete migration {{(pid=60764) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1402.995147] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1403.329792] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1403.330095] env[60764]: DEBUG oslo_service.periodic_task 
[None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1404.330943] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1404.331226] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1404.341869] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] There are 0 instances to clean {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1404.342106] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1406.342859] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1407.330650] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1407.330936] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1407.331091] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1424.150056] env[60764]: WARNING oslo_vmware.rw_handles [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1424.150056] env[60764]: ERROR oslo_vmware.rw_handles [ 1424.151167] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1424.152636] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1424.152885] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Copying Virtual Disk [datastore2] vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/1142eea1-d8d7-43b3-8ba2-f3b34d4bfb9f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1424.153201] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-99c53f89-de9e-495c-9e72-4516d17907b1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1424.162530] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting 
for the task: (returnval){ [ 1424.162530] env[60764]: value = "task-2204986" [ 1424.162530] env[60764]: _type = "Task" [ 1424.162530] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1424.173129] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2204986, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1424.673272] env[60764]: DEBUG oslo_vmware.exceptions [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1424.673544] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1424.674129] env[60764]: ERROR nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1424.674129] env[60764]: Faults: ['InvalidArgument'] [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Traceback (most recent call last): [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] yield resources [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self.driver.spawn(context, instance, image_meta, [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self._fetch_image_if_missing(context, vi) [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1424.674129] env[60764]: ERROR nova.compute.manager 
[instance: dfd3e3af-90c9-420b-81ec-e9115c519016] image_cache(vi, tmp_image_ds_loc) [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] vm_util.copy_virtual_disk( [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] session._wait_for_task(vmdk_copy_task) [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] return self.wait_for_task(task_ref) [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] return evt.wait() [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] result = hub.switch() [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] return self.greenlet.switch() [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self.f(*self.args, **self.kw) [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] raise exceptions.translate_fault(task_info.error) [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Faults: ['InvalidArgument'] [ 1424.674129] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] [ 1424.674971] env[60764]: INFO nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Terminating instance [ 1424.676038] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 
tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1424.676255] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1424.676562] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5b0a4997-15ee-438f-a037-8018692eb072 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1424.679218] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1424.679422] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1424.680263] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dab2598-6e58-44aa-843e-4d29618d2771 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1424.687411] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1424.687672] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-226672af-28e5-4569-abb1-50f2f820cbbd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1424.690022] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1424.690205] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1424.691192] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ea9b1eb0-9760-46af-ab8b-2a5e15409424 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1424.696011] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Waiting for the task: (returnval){ [ 1424.696011] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52c1a63b-3d3d-a79a-1f14-4fe9a53a5184" [ 1424.696011] env[60764]: _type = "Task" [ 1424.696011] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1424.704582] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52c1a63b-3d3d-a79a-1f14-4fe9a53a5184, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1424.761404] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1424.761647] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1424.761807] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleting the datastore file [datastore2] dfd3e3af-90c9-420b-81ec-e9115c519016 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1424.762102] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-727ef1ad-6bb9-41c5-a41d-a396e18f1f2c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1424.768417] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 1424.768417] env[60764]: value = "task-2204988" [ 1424.768417] env[60764]: _type = "Task" [ 1424.768417] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1424.776674] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2204988, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1425.206911] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1425.208045] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Creating directory with path [datastore2] vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1425.208045] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-34515c2e-ba5a-47aa-9a59-3623b93c41cd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.218578] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Created directory with path [datastore2] vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1425.218769] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Fetch image to [datastore2] vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1425.218935] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1425.219700] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-123205f7-c84c-4c4c-a3b8-2c0162fee802 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.226100] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97717233-3693-4ff7-a8a9-fee4270d1a6f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.235783] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f0188fc-7165-4841-beee-e446e91dd1ed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.265310] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96da9440-5587-407d-9d08-15defca40a06 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.273357] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-20493fa9-55b7-4d95-aefd-ea6f7eb16c01 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.277532] env[60764]: DEBUG oslo_vmware.api [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2204988, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080515} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1425.278046] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1425.278274] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1425.278468] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1425.278643] env[60764]: INFO nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Took 0.60 seconds to destroy the instance on the hypervisor. 
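[editor's note] The entries around this point trace a single datastore operation end to end: a vCenter task is created (CopyVirtualDisk_Task, DeleteDatastoreFile_Task), wait_for_task/_poll_task log its progress, and a server-side fault such as InvalidArgument surfaces to the caller as a raised exception (the VimFaultException traceback above) rather than a return value. The following is a minimal, hypothetical Python sketch of that poll-and-translate pattern; fetch_task_info, TaskFault and the task-info dict layout are illustrative assumptions and are not the actual oslo.vmware or nova API.

import time


class TaskFault(Exception):
    """Raised when the polled task reports an error state (illustrative only)."""


def wait_for_task(fetch_task_info, task_ref, interval=0.5, timeout=60.0):
    """Poll task_ref until it succeeds, faults, or times out.

    fetch_task_info is assumed to return a dict with keys
    'state' ('running' | 'success' | 'error'), 'progress', and 'error'.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = fetch_task_info(task_ref)
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # Mirror the behaviour seen in the traceback above: the poller
            # translates the server-side fault into an exception for the caller.
            raise TaskFault(info.get('error', 'unknown fault'))
        time.sleep(interval)  # corresponds to the repeated "progress is 0%" entries between polls
    raise TimeoutError(f'task {task_ref} did not complete in {timeout}s')

Polling against a deadline, rather than blocking indefinitely, matches the behaviour visible in the log, where each poll records current progress until the task either completes or faults.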
[ 1425.280740] env[60764]: DEBUG nova.compute.claims [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1425.280907] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1425.281134] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1425.296240] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1425.352513] env[60764]: DEBUG oslo_vmware.rw_handles [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1425.413160] env[60764]: DEBUG oslo_vmware.rw_handles [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1425.413364] env[60764]: DEBUG oslo_vmware.rw_handles [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1425.590253] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-038a227c-9a5b-44cb-8d33-d6e3a93da3a2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.597574] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab7f0d3-800b-4cd2-8beb-fae475842e45 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.627648] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8119f7d0-11fe-4238-b398-a4f4e7802691 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.634415] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0501e931-4749-4761-b553-869004087860 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1425.647303] env[60764]: DEBUG nova.compute.provider_tree [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1425.655302] env[60764]: DEBUG nova.scheduler.client.report [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1425.670558] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.389s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1425.671089] env[60764]: ERROR nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1425.671089] env[60764]: Faults: ['InvalidArgument'] [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Traceback (most recent call last): [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1425.671089] 
env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self.driver.spawn(context, instance, image_meta, [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self._fetch_image_if_missing(context, vi) [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] image_cache(vi, tmp_image_ds_loc) [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] vm_util.copy_virtual_disk( [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] session._wait_for_task(vmdk_copy_task) [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] return self.wait_for_task(task_ref) [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] return evt.wait() [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] result = hub.switch() [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] return self.greenlet.switch() [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] self.f(*self.args, **self.kw) [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] raise exceptions.translate_fault(task_info.error) [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Faults: ['InvalidArgument'] [ 1425.671089] env[60764]: ERROR nova.compute.manager [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] [ 1425.671990] env[60764]: DEBUG nova.compute.utils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1425.673158] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Build of instance dfd3e3af-90c9-420b-81ec-e9115c519016 was re-scheduled: A specified parameter was not correct: fileType [ 1425.673158] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1425.673526] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1425.673696] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1425.673861] env[60764]: DEBUG nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1425.674060] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1426.036419] env[60764]: DEBUG nova.network.neutron [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1426.046304] env[60764]: INFO nova.compute.manager [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Took 0.37 seconds to deallocate network for instance. [ 1426.138870] env[60764]: INFO nova.scheduler.client.report [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleted allocations for instance dfd3e3af-90c9-420b-81ec-e9115c519016 [ 1426.159353] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d86d7d6c-e819-461f-90af-639d170cc64a tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 659.624s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.160641] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 463.272s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1426.160827] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "dfd3e3af-90c9-420b-81ec-e9115c519016-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1426.161062] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1426.161246] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.163363] env[60764]: INFO nova.compute.manager [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Terminating instance [ 1426.165293] env[60764]: DEBUG nova.compute.manager [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1426.165484] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1426.165949] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f67c151a-567b-4a85-966f-2e7e57cc2faf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.177101] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8779e512-57e3-4cf3-9706-db6f680e6b52 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.188205] env[60764]: DEBUG nova.compute.manager [None req-1b7fa0df-ee2d-4ed8-a9f7-23f126c65601 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 8d616729-866d-4ebb-8bc7-cf2172b70382] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1426.208089] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dfd3e3af-90c9-420b-81ec-e9115c519016 could not be found. 
[ 1426.208332] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1426.208505] env[60764]: INFO nova.compute.manager [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1426.208742] env[60764]: DEBUG oslo.service.loopingcall [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1426.208946] env[60764]: DEBUG nova.compute.manager [-] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1426.209052] env[60764]: DEBUG nova.network.neutron [-] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1426.211731] env[60764]: DEBUG nova.compute.manager [None req-1b7fa0df-ee2d-4ed8-a9f7-23f126c65601 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 8d616729-866d-4ebb-8bc7-cf2172b70382] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1426.235042] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1b7fa0df-ee2d-4ed8-a9f7-23f126c65601 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "8d616729-866d-4ebb-8bc7-cf2172b70382" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.848s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.237077] env[60764]: DEBUG nova.network.neutron [-] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1426.244343] env[60764]: INFO nova.compute.manager [-] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] Took 0.04 seconds to deallocate network for instance. [ 1426.249507] env[60764]: DEBUG nova.compute.manager [None req-ef83ffa2-ac55-4d6b-bd45-18ad1cc60e83 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] [instance: 5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1426.273962] env[60764]: DEBUG nova.compute.manager [None req-ef83ffa2-ac55-4d6b-bd45-18ad1cc60e83 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] [instance: 5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1426.311959] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ef83ffa2-ac55-4d6b-bd45-18ad1cc60e83 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Lock "5ec9deb5-ad1c-4908-afc8-8e72f8b0cb85" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.387s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.321277] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1426.356148] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d704c046-69ae-4d93-ba68-785e58329b37 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.359849] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 329.983s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1426.359952] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: dfd3e3af-90c9-420b-81ec-e9115c519016] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1426.360105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "dfd3e3af-90c9-420b-81ec-e9115c519016" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.375484] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1426.375484] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1426.376781] env[60764]: INFO nova.compute.claims [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1426.623686] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2be671be-990c-4e44-8e68-85a1983e5333 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.631350] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41b328da-b153-455d-b335-ab634d4826e3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.664047] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf430c7-2177-4084-880e-64e036fab796 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.670995] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6ea869f-8f84-459d-a249-1bdb525c125c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.684418] env[60764]: DEBUG nova.compute.provider_tree [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1426.692305] env[60764]: DEBUG nova.scheduler.client.report [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1426.706734] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1426.707270] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1426.742052] env[60764]: DEBUG nova.compute.utils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1426.745061] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1426.745061] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1426.751717] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1426.797698] env[60764]: DEBUG nova.policy [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2016b787a58742daae5db8b8e2e3cc4f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4865cb5e5dc64c73a27ef07294e321ab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1426.809530] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1426.836218] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1426.836500] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1426.836701] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1426.836916] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1426.837143] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1426.837348] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1426.837591] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1426.837779] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1426.838009] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d 
tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1426.838212] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1426.838425] env[60764]: DEBUG nova.virt.hardware [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1426.839386] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1951a84-fd85-4df9-8bab-3aa067aff4dc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1426.848093] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb7d0d1d-8497-4aed-b457-96eee7e6a463 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1427.140296] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Successfully created port: 9afe84c6-e824-4a12-96b6-038e03802487 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1427.763321] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Successfully updated port: 9afe84c6-e824-4a12-96b6-038e03802487 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1427.775982] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "refresh_cache-bf522599-8aa5-411a-96dd-8bd8328d9156" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1427.776157] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquired lock "refresh_cache-bf522599-8aa5-411a-96dd-8bd8328d9156" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1427.776320] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1427.816788] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: 
bf522599-8aa5-411a-96dd-8bd8328d9156] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1427.979738] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Updating instance_info_cache with network_info: [{"id": "9afe84c6-e824-4a12-96b6-038e03802487", "address": "fa:16:3e:16:43:65", "network": {"id": "75d04ed2-35e0-4778-a364-7734da81b956", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1413201636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4865cb5e5dc64c73a27ef07294e321ab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "271fe7a0-dfd7-409b-920a-cf83ef1a86a3", "external-id": "nsx-vlan-transportzone-728", "segmentation_id": 728, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9afe84c6-e8", "ovs_interfaceid": "9afe84c6-e824-4a12-96b6-038e03802487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1427.993479] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Releasing lock "refresh_cache-bf522599-8aa5-411a-96dd-8bd8328d9156" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1427.993771] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Instance network_info: |[{"id": "9afe84c6-e824-4a12-96b6-038e03802487", "address": "fa:16:3e:16:43:65", "network": {"id": "75d04ed2-35e0-4778-a364-7734da81b956", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1413201636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4865cb5e5dc64c73a27ef07294e321ab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "271fe7a0-dfd7-409b-920a-cf83ef1a86a3", "external-id": "nsx-vlan-transportzone-728", "segmentation_id": 728, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9afe84c6-e8", "ovs_interfaceid": "9afe84c6-e824-4a12-96b6-038e03802487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1427.994176] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:16:43:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '271fe7a0-dfd7-409b-920a-cf83ef1a86a3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9afe84c6-e824-4a12-96b6-038e03802487', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1428.001827] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Creating folder: Project (4865cb5e5dc64c73a27ef07294e321ab). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1428.002384] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-48f86cef-3f7a-4e85-8f86-5f1286a11e21 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.014213] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Created folder: Project (4865cb5e5dc64c73a27ef07294e321ab) in parent group-v449629. [ 1428.014213] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Creating folder: Instances. Parent ref: group-v449719. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1428.014213] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a36b1138-f95a-405f-97ac-2cc32b0b0e6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.023283] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Created folder: Instances in parent group-v449719. [ 1428.023504] env[60764]: DEBUG oslo.service.loopingcall [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1428.023684] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1428.023888] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a20b2085-f839-4742-a248-5e7aee07f4c8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.042841] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1428.042841] env[60764]: value = "task-2204991" [ 1428.042841] env[60764]: _type = "Task" [ 1428.042841] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1428.049954] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204991, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1428.100296] env[60764]: DEBUG nova.compute.manager [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Received event network-vif-plugged-9afe84c6-e824-4a12-96b6-038e03802487 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1428.100527] env[60764]: DEBUG oslo_concurrency.lockutils [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] Acquiring lock "bf522599-8aa5-411a-96dd-8bd8328d9156-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1428.100752] env[60764]: DEBUG oslo_concurrency.lockutils [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1428.100947] env[60764]: DEBUG oslo_concurrency.lockutils [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1428.101651] env[60764]: DEBUG nova.compute.manager [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] No waiting events found dispatching network-vif-plugged-9afe84c6-e824-4a12-96b6-038e03802487 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1428.101651] env[60764]: WARNING nova.compute.manager [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Received unexpected event network-vif-plugged-9afe84c6-e824-4a12-96b6-038e03802487 for instance with vm_state building and task_state spawning. [ 1428.101651] env[60764]: DEBUG nova.compute.manager [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Received event network-changed-9afe84c6-e824-4a12-96b6-038e03802487 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1428.101794] env[60764]: DEBUG nova.compute.manager [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Refreshing instance network info cache due to event network-changed-9afe84c6-e824-4a12-96b6-038e03802487. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1428.101955] env[60764]: DEBUG oslo_concurrency.lockutils [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] Acquiring lock "refresh_cache-bf522599-8aa5-411a-96dd-8bd8328d9156" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1428.102161] env[60764]: DEBUG oslo_concurrency.lockutils [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] Acquired lock "refresh_cache-bf522599-8aa5-411a-96dd-8bd8328d9156" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1428.102354] env[60764]: DEBUG nova.network.neutron [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Refreshing network info cache for port 9afe84c6-e824-4a12-96b6-038e03802487 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1428.414098] env[60764]: DEBUG nova.network.neutron [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Updated VIF entry in instance network info cache for port 9afe84c6-e824-4a12-96b6-038e03802487. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1428.414481] env[60764]: DEBUG nova.network.neutron [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Updating instance_info_cache with network_info: [{"id": "9afe84c6-e824-4a12-96b6-038e03802487", "address": "fa:16:3e:16:43:65", "network": {"id": "75d04ed2-35e0-4778-a364-7734da81b956", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1413201636-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "4865cb5e5dc64c73a27ef07294e321ab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "271fe7a0-dfd7-409b-920a-cf83ef1a86a3", "external-id": "nsx-vlan-transportzone-728", "segmentation_id": 728, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9afe84c6-e8", "ovs_interfaceid": "9afe84c6-e824-4a12-96b6-038e03802487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1428.424077] env[60764]: DEBUG oslo_concurrency.lockutils [req-e2c4da89-1777-46ed-87a8-e2eec1f30fed req-513f54d0-5430-4236-9854-2eb2a4255c62 service nova] Releasing lock "refresh_cache-bf522599-8aa5-411a-96dd-8bd8328d9156" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1428.552771] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204991, 'name': CreateVM_Task, 'duration_secs': 0.323604} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1428.552937] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1428.553577] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1428.553741] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1428.554081] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1428.554581] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a72eaaef-7c9b-4afd-9eb8-b1414247df09 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.559043] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Waiting for the task: (returnval){ [ 1428.559043] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]525ec318-bb5f-e658-6a79-25a2d4218105" [ 1428.559043] env[60764]: _type = "Task" [ 1428.559043] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1428.566168] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]525ec318-bb5f-e658-6a79-25a2d4218105, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1429.069041] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1429.069355] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1429.069562] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1439.323442] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "bf522599-8aa5-411a-96dd-8bd8328d9156" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1459.331088] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1459.331442] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1459.331442] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1459.355085] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.355291] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.355431] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.355564] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.355689] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.355812] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.355934] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.356062] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.356183] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.356299] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1459.356414] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1460.329362] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1461.330328] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1461.342084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1461.342375] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1461.342580] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1461.342750] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1461.344006] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb8ce463-c6f2-4105-81cc-53ed6f703d95 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.352933] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a1fe8a9-8a3c-40c4-a475-00475dee5055 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.366295] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24966a8c-1823-427b-9033-36409bade219 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.372933] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9b34aca-1a29-4059-9be4-4d06ac7bec76 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.404533] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181257MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1461.404674] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1461.404863] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1461.480080] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e80dd396-f709-48d7-bc98-159b175f5593 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.480269] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.480398] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.480520] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.480639] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.480757] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.480924] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.481085] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.481208] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.481323] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1461.492275] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.502307] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.511830] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.520611] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7d4b7608-622c-41c8-9532-e216ed41db91 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.528975] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af78199-5a15-4ed8-94b3-abc98dbffe37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.537873] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.547925] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e7742bf9-cd57-4a84-853f-886e5bc5a6b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.556618] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f13299b8-2c86-41a2-b14c-bfd68ab6dd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1461.556841] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1461.556989] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1461.761410] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-546bffa7-2ac5-49b6-a02a-703cb91c1a00 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.768759] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ead925c5-cc82-4a0d-953e-fc75cc21d648 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.798042] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9225fc70-6bca-423d-b5b6-044f5c65c8d3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.804941] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa58ace3-117c-420e-9256-7e26df2dd92c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.818554] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1461.826447] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1461.841864] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1461.842131] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.841525] env[60764]: DEBUG oslo_service.periodic_task [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1464.841897] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1465.329897] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1467.325823] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1467.326093] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1469.330605] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1469.330912] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1469.330979] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1474.167114] env[60764]: WARNING oslo_vmware.rw_handles [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1474.167114] env[60764]: ERROR oslo_vmware.rw_handles [ 1474.167822] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1474.169489] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1474.169752] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Copying Virtual Disk [datastore2] vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/d43b1825-62e3-40e7-a191-a2a1b7be58bb/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1474.170052] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-68f05448-7017-4bbe-a5a3-3e8e946174a6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1474.177788] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Waiting for the task: (returnval){ [ 1474.177788] 
env[60764]: value = "task-2204992" [ 1474.177788] env[60764]: _type = "Task" [ 1474.177788] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1474.185855] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Task: {'id': task-2204992, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1474.687513] env[60764]: DEBUG oslo_vmware.exceptions [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1474.687802] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1474.688388] env[60764]: ERROR nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1474.688388] env[60764]: Faults: ['InvalidArgument'] [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] Traceback (most recent call last): [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] yield resources [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self.driver.spawn(context, instance, image_meta, [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self._fetch_image_if_missing(context, vi) [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] image_cache(vi, tmp_image_ds_loc) [ 
1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] vm_util.copy_virtual_disk( [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] session._wait_for_task(vmdk_copy_task) [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] return self.wait_for_task(task_ref) [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] return evt.wait() [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] result = hub.switch() [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] return self.greenlet.switch() [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self.f(*self.args, **self.kw) [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] raise exceptions.translate_fault(task_info.error) [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] Faults: ['InvalidArgument'] [ 1474.688388] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] [ 1474.689438] env[60764]: INFO nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Terminating instance [ 1474.690301] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 
tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1474.690524] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1474.690762] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6cd77bd3-d0a5-4c69-aa84-ee99eac67277 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1474.692970] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1474.694116] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1474.694116] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2030d8a6-c3a6-4d86-b24e-b31465246a34 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1474.701034] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1474.701156] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6dee5151-adc4-4435-a124-0498229c443f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1474.703442] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1474.703622] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1474.704600] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e34ec1a0-3c75-4d1b-90fd-feb164e6f4fd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1474.709222] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Waiting for the task: (returnval){ [ 1474.709222] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]528939c7-22fd-9c7f-48e8-7f982b62f3c7" [ 1474.709222] env[60764]: _type = "Task" [ 1474.709222] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1474.716578] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]528939c7-22fd-9c7f-48e8-7f982b62f3c7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1474.817798] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1474.818068] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1474.818440] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Deleting the datastore file [datastore2] e80dd396-f709-48d7-bc98-159b175f5593 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1474.818731] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5b30ae3c-235f-4460-81d8-994565d62ab8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1474.825523] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Waiting for the task: (returnval){ [ 1474.825523] env[60764]: value = "task-2204994" [ 1474.825523] env[60764]: _type = "Task" [ 1474.825523] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1474.833385] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Task: {'id': task-2204994, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1475.219000] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1475.220047] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Creating directory with path [datastore2] vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1475.220047] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b5eb14c3-2a28-47c4-bafe-5e40f92087be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.230660] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Created directory with path [datastore2] vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1475.230839] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Fetch image to [datastore2] vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1475.231010] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1475.231729] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f76781a4-c9ef-4a32-a950-9a29ae075047 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.237660] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a306e168-2ee9-4079-9a49-24a1131371e8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.246174] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80db6a9a-5271-4086-86ea-35adf0c773b2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.277229] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16e47a41-4a57-486c-bfef-9c2215596058 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.282210] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-36d37296-0de5-48d5-bed8-0759a96cecf9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.302245] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1475.333913] env[60764]: DEBUG oslo_vmware.api [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Task: {'id': task-2204994, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084569} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1475.334866] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1475.334866] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1475.334866] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1475.334866] env[60764]: INFO nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Took 0.64 seconds to destroy the instance on the hypervisor. 
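The records just above (invoking FileManager.DeleteDatastoreFile_Task, polling "progress is 0%", then "completed successfully ... duration_secs: 0.084569") all follow the same oslo.vmware invoke-then-wait pattern used throughout this log. The following is a minimal, illustrative sketch of that pattern only, assuming a reachable vCenter; the host, credentials, datastore path, and retry/poll values are placeholders and are not taken from this log.

from oslo_vmware import api, vim_util

# Placeholder connection details -- not values from this environment.
session = api.VMwareAPISession(
    'vcenter.example.org', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# Look up a Datacenter managed object reference to pass to the file manager call.
dc_ref = session.invoke_api(
    vim_util, 'get_objects', session.vim, 'Datacenter', 100).objects[0].obj

# Start the asynchronous SOAP task, then let oslo.vmware poll it to completion
# (the "Waiting for the task" / "progress is 0%" records seen above).
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] some-instance-dir',   # placeholder datastore path
    datacenter=dc_ref)
session.wait_for_task(task)   # raises a translated VimException if the task fails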
[ 1475.336754] env[60764]: DEBUG nova.compute.claims [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1475.336918] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1475.337143] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1475.455805] env[60764]: DEBUG oslo_vmware.rw_handles [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1475.517355] env[60764]: DEBUG oslo_vmware.rw_handles [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1475.517518] env[60764]: DEBUG oslo_vmware.rw_handles [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1475.651363] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e0935c6-3f29-4675-9ccd-3f4d20c62295 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.659289] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52b7eb88-90c1-4b36-a6ff-6a0bfa9083ff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.689850] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8591e56b-cacf-4e1f-9d79-e2f63a960107 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.697209] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2d1845d-051a-49c9-8c67-e9cd0e7e2791 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1475.710618] env[60764]: DEBUG nova.compute.provider_tree [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1475.721971] env[60764]: DEBUG nova.scheduler.client.report [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1475.737022] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.399s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1475.737022] env[60764]: ERROR nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1475.737022] env[60764]: Faults: ['InvalidArgument'] [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] Traceback (most recent call last): [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1475.737022] env[60764]: ERROR nova.compute.manager 
[instance: e80dd396-f709-48d7-bc98-159b175f5593] self.driver.spawn(context, instance, image_meta, [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self._fetch_image_if_missing(context, vi) [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] image_cache(vi, tmp_image_ds_loc) [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] vm_util.copy_virtual_disk( [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] session._wait_for_task(vmdk_copy_task) [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] return self.wait_for_task(task_ref) [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] return evt.wait() [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] result = hub.switch() [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] return self.greenlet.switch() [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] self.f(*self.args, **self.kw) [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] raise exceptions.translate_fault(task_info.error) [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] Faults: ['InvalidArgument'] [ 1475.737022] env[60764]: ERROR nova.compute.manager [instance: e80dd396-f709-48d7-bc98-159b175f5593] [ 1475.737961] env[60764]: DEBUG nova.compute.utils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1475.738538] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Build of instance e80dd396-f709-48d7-bc98-159b175f5593 was re-scheduled: A specified parameter was not correct: fileType [ 1475.738538] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1475.738904] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1475.739089] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1475.739255] env[60764]: DEBUG nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1475.739426] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1476.404412] env[60764]: DEBUG nova.network.neutron [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1476.419271] env[60764]: INFO nova.compute.manager [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Took 0.68 seconds to deallocate network for instance. [ 1476.515892] env[60764]: INFO nova.scheduler.client.report [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Deleted allocations for instance e80dd396-f709-48d7-bc98-159b175f5593 [ 1476.541837] env[60764]: DEBUG oslo_concurrency.lockutils [None req-9baba915-8995-4648-bf06-1c1da3b93b31 tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "e80dd396-f709-48d7-bc98-159b175f5593" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 686.912s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1476.545223] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "e80dd396-f709-48d7-bc98-159b175f5593" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 489.522s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1476.545223] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "e80dd396-f709-48d7-bc98-159b175f5593-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1476.545223] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "e80dd396-f709-48d7-bc98-159b175f5593-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1476.545223] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "e80dd396-f709-48d7-bc98-159b175f5593-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1476.547093] env[60764]: INFO nova.compute.manager [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Terminating instance [ 1476.548767] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquiring lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1476.548937] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Acquired lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1476.549124] env[60764]: DEBUG nova.network.neutron [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1476.561644] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1476.584420] env[60764]: DEBUG nova.network.neutron [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1476.624467] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1476.624723] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1476.626271] env[60764]: INFO nova.compute.claims [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1476.703794] env[60764]: DEBUG nova.network.neutron [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1476.718818] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Releasing lock "refresh_cache-e80dd396-f709-48d7-bc98-159b175f5593" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1476.718818] env[60764]: DEBUG nova.compute.manager [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1476.718818] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1476.718818] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-725b3a1b-7a33-4e1a-90d5-6dfa2d679e55 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1476.729157] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6d4d251-729e-472a-88dd-eaeaeb0ca7b6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1476.761996] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e80dd396-f709-48d7-bc98-159b175f5593 could not be found. [ 1476.762223] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1476.762401] env[60764]: INFO nova.compute.manager [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1476.762658] env[60764]: DEBUG oslo.service.loopingcall [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1476.765183] env[60764]: DEBUG nova.compute.manager [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1476.765291] env[60764]: DEBUG nova.network.neutron [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1476.788561] env[60764]: DEBUG nova.network.neutron [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1476.799028] env[60764]: DEBUG nova.network.neutron [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1476.812263] env[60764]: INFO nova.compute.manager [-] [instance: e80dd396-f709-48d7-bc98-159b175f5593] Took 0.05 seconds to deallocate network for instance. 
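The build failure recorded at the start of this excerpt surfaces as oslo_vmware.exceptions.VimFaultException with Faults: ['InvalidArgument'], raised from the session's task-polling loop (_poll_task in oslo_vmware/api.py), and the later CreateVM_Task and CopyVirtualDisk_Task entries go through the same wait_for_task machinery. A minimal sketch, assuming an already-created oslo.vmware session object and a vSphere task reference (neither is reproduced from this log), of how a caller can drive that polling and surface the fault names:

    # Sketch only, not Nova's implementation. Assumes `session` is an existing
    # oslo_vmware.api.VMwareAPISession and `task_ref` is a task reference
    # returned by an earlier invoke_api() call.
    from oslo_vmware import exceptions as vexc

    def wait_and_report(session, task_ref):
        try:
            # wait_for_task() polls the task until it reaches 'success' or
            # 'error'; on error the translated fault is raised, as in the
            # traceback near the top of this excerpt.
            return session.wait_for_task(task_ref)
        except vexc.VimFaultException as err:
            # fault_list carries the vSphere fault names, e.g. ['InvalidArgument'].
            print("vSphere task failed: %s (faults: %s)" % (err, err.fault_list))
            raise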
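The Acquiring/acquired/released lock lines that recur throughout the excerpt are emitted by oslo.concurrency's lock helpers, which also record how long each lock was waited on and held. A small sketch of the two usual forms of that API, using an invented lock name ("demo-lock") rather than any lock from this log:

    # Sketch only; the lock name "demo-lock" is invented for illustration.
    from oslo_concurrency import lockutils

    @lockutils.synchronized("demo-lock")
    def critical_section():
        # Only one caller per process enters at a time; oslo.concurrency logs
        # "acquired ... waited Ns" and "released ... held Ns" around the call.
        return "done"

    # Equivalent context-manager form of the same per-process lock.
    with lockutils.lock("demo-lock"):
        pass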
[ 1476.934330] env[60764]: DEBUG oslo_concurrency.lockutils [None req-def3b878-9730-4ddb-b4da-9c9ea434329f tempest-ServersTestMultiNic-368756324 tempest-ServersTestMultiNic-368756324-project-member] Lock "e80dd396-f709-48d7-bc98-159b175f5593" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.390s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1476.935450] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "e80dd396-f709-48d7-bc98-159b175f5593" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 380.558s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1476.935538] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e80dd396-f709-48d7-bc98-159b175f5593] During sync_power_state the instance has a pending task (deleting). Skip. [ 1476.935730] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "e80dd396-f709-48d7-bc98-159b175f5593" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1476.997523] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85d758ae-7311-49b8-a28b-cddfae7a0547 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.009681] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40b08563-d96e-4959-b0ef-44b8e4bf7722 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.041642] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b27758-5258-4cb0-b419-18ab6ca6385c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.049443] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ddfd7c6-5a72-4778-b10a-61760b63095f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.062474] env[60764]: DEBUG nova.compute.provider_tree [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1477.072531] env[60764]: DEBUG nova.scheduler.client.report [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 
1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1477.086046] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1477.086536] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1477.132854] env[60764]: DEBUG nova.compute.utils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1477.134902] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1477.135458] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1477.146555] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1477.200700] env[60764]: DEBUG nova.policy [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a004b099c454915974dd5378f8d966e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3925a62cc1844efca8b4186497fa595c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1477.222021] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None 
req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1477.251312] env[60764]: DEBUG nova.virt.hardware [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1477.251312] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56904fc0-ad2b-497c-9adf-69f754f6a67f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.261477] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92d61e68-b91f-40bb-b17d-82c72513b758 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1477.544872] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Successfully created port: b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1478.578674] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Successfully updated port: b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1478.590751] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "refresh_cache-3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1478.590892] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquired lock "refresh_cache-3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1478.591055] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1478.630888] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 
tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1478.785972] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Updating instance_info_cache with network_info: [{"id": "b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5", "address": "fa:16:3e:38:0f:37", "network": {"id": "fe912966-7dd2-437b-83ad-e49871117502", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-564062116-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3925a62cc1844efca8b4186497fa595c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9be6786-e9a7-4138-b7b5-b7696f6cb1e1", "external-id": "nsx-vlan-transportzone-626", "segmentation_id": 626, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0bbd4f7-32", "ovs_interfaceid": "b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1478.800295] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Releasing lock "refresh_cache-3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1478.800701] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance network_info: |[{"id": "b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5", "address": "fa:16:3e:38:0f:37", "network": {"id": "fe912966-7dd2-437b-83ad-e49871117502", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-564062116-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3925a62cc1844efca8b4186497fa595c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9be6786-e9a7-4138-b7b5-b7696f6cb1e1", "external-id": "nsx-vlan-transportzone-626", "segmentation_id": 626, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0bbd4f7-32", "ovs_interfaceid": "b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1478.801138] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:0f:37', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9be6786-e9a7-4138-b7b5-b7696f6cb1e1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1478.809060] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Creating folder: Project (3925a62cc1844efca8b4186497fa595c). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1478.809568] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5620a5dd-1ec5-40ee-93a1-28d4847b13d9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.823013] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Created folder: Project (3925a62cc1844efca8b4186497fa595c) in parent group-v449629. [ 1478.823195] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Creating folder: Instances. Parent ref: group-v449722. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1478.823412] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e4ca080f-6cba-4ac0-9339-940e221f5197 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.832527] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Created folder: Instances in parent group-v449722. [ 1478.832744] env[60764]: DEBUG oslo.service.loopingcall [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1478.832913] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1478.833107] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-064d4089-dd9c-4701-96c0-502df83a9076 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1478.851365] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1478.851365] env[60764]: value = "task-2204997" [ 1478.851365] env[60764]: _type = "Task" [ 1478.851365] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1478.858510] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204997, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1479.363042] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2204997, 'name': CreateVM_Task, 'duration_secs': 0.310009} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1479.363042] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1479.363291] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1479.363457] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1479.363772] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1479.364038] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-15399b47-2343-4c7e-898c-95a895869f14 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.368399] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Waiting for the task: (returnval){ [ 1479.368399] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]526bc826-6029-f26d-7cba-2ef4ad45059a" [ 1479.368399] env[60764]: _type = "Task" [ 1479.368399] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1479.376852] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]526bc826-6029-f26d-7cba-2ef4ad45059a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1479.879407] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1479.879788] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1479.879847] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1480.362773] env[60764]: DEBUG nova.compute.manager [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Received event network-vif-plugged-b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1480.363018] env[60764]: DEBUG oslo_concurrency.lockutils [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] Acquiring lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1480.363225] env[60764]: DEBUG oslo_concurrency.lockutils [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1480.363392] env[60764]: DEBUG oslo_concurrency.lockutils [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1480.363557] env[60764]: DEBUG nova.compute.manager [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 
3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] No waiting events found dispatching network-vif-plugged-b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1480.363717] env[60764]: WARNING nova.compute.manager [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Received unexpected event network-vif-plugged-b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 for instance with vm_state building and task_state spawning. [ 1480.363875] env[60764]: DEBUG nova.compute.manager [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Received event network-changed-b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1480.364035] env[60764]: DEBUG nova.compute.manager [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Refreshing instance network info cache due to event network-changed-b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1480.364222] env[60764]: DEBUG oslo_concurrency.lockutils [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] Acquiring lock "refresh_cache-3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1480.364356] env[60764]: DEBUG oslo_concurrency.lockutils [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] Acquired lock "refresh_cache-3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1480.364511] env[60764]: DEBUG nova.network.neutron [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Refreshing network info cache for port b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1480.668670] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1480.686729] env[60764]: DEBUG nova.network.neutron [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Updated VIF entry in instance network info cache for port b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1480.687081] env[60764]: DEBUG nova.network.neutron [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Updating instance_info_cache with network_info: [{"id": "b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5", "address": "fa:16:3e:38:0f:37", "network": {"id": "fe912966-7dd2-437b-83ad-e49871117502", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-564062116-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3925a62cc1844efca8b4186497fa595c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9be6786-e9a7-4138-b7b5-b7696f6cb1e1", "external-id": "nsx-vlan-transportzone-626", "segmentation_id": 626, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0bbd4f7-32", "ovs_interfaceid": "b0bbd4f7-32d8-42df-90ed-f4a47a1ed7f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1480.699257] env[60764]: DEBUG oslo_concurrency.lockutils [req-0b9cad68-f45d-415d-9602-9dc8a511bb94 req-0c36894c-df54-4290-a903-b6b3d6210c11 service nova] Releasing lock "refresh_cache-3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1481.984699] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "a6272c75-92de-45a0-8e3e-82e342f0475c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1481.984699] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1521.332838] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1521.333221] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1521.333221] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1521.378447] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.378851] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.378851] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379038] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379038] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379324] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379470] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379605] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379740] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.379872] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1521.380011] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1521.380517] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1523.329781] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1523.342273] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1523.342520] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1523.342687] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1523.342845] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1523.344071] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bc61b8c-972a-4d14-9784-fcc9ca4392a0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.353140] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9212689-4d07-47e1-a60f-906c4f0211c8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.366807] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b59f8249-e69c-4ab6-a3dc-4c46fde8fedf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.373077] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41ec5ee0-5dbb-4170-bbd4-1b131e3d14d7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.402470] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181260MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1523.402622] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1523.402813] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1523.476730] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.476897] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477037] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477162] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477280] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477399] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477516] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477632] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477746] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.477856] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1523.488182] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.498317] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.507663] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7d4b7608-622c-41c8-9532-e216ed41db91 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.516937] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6af78199-5a15-4ed8-94b3-abc98dbffe37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.525640] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.534376] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e7742bf9-cd57-4a84-853f-886e5bc5a6b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.542906] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f13299b8-2c86-41a2-b14c-bfd68ab6dd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.553601] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1523.553829] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1523.553977] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1523.636365] env[60764]: WARNING oslo_vmware.rw_handles [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1523.636365] env[60764]: ERROR oslo_vmware.rw_handles [ 1523.636897] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1523.638686] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1523.638924] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/20ab1f58-9631-4e14-a224-2b5eb311af56/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1523.639206] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-261ff9d7-e24a-430c-a423-9494e4a06d05 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.648527] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Waiting for the task: (returnval){ [ 1523.648527] env[60764]: value = "task-2204998" [ 1523.648527] env[60764]: _type = "Task" [ 1523.648527] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1523.656481] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Task: {'id': task-2204998, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1523.762006] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe88768a-7f93-4ae0-aa50-ab851a88a0e2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.769451] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9acb126b-8c52-44a2-8b46-62a96db4fbf1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.799504] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8545e628-42a5-428b-8503-5e975da637b3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.806807] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f5ccdd-0feb-4e7e-a2b7-71d984c1b9ac {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1523.820110] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1523.828187] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1523.842252] env[60764]: 
DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1523.842476] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.440s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1524.158780] env[60764]: DEBUG oslo_vmware.exceptions [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1524.159093] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1524.159685] env[60764]: ERROR nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1524.159685] env[60764]: Faults: ['InvalidArgument'] [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Traceback (most recent call last): [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] yield resources [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self.driver.spawn(context, instance, image_meta, [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self._fetch_image_if_missing(context, vi) [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1524.159685] 
env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] image_cache(vi, tmp_image_ds_loc) [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] vm_util.copy_virtual_disk( [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] session._wait_for_task(vmdk_copy_task) [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] return self.wait_for_task(task_ref) [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] return evt.wait() [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] result = hub.switch() [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] return self.greenlet.switch() [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self.f(*self.args, **self.kw) [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] raise exceptions.translate_fault(task_info.error) [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Faults: ['InvalidArgument'] [ 1524.159685] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] [ 1524.160606] env[60764]: INFO nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Terminating instance [ 1524.161547] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1524.161764] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1524.161998] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05cf7919-c8b8-4e61-ab8d-6b5c6a63e707 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.164352] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1524.164546] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1524.165293] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3287290b-eeb5-4e60-b1da-962d313b75be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.172034] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1524.172304] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-72a18d00-34e3-4283-a770-ddcd6444cafb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.174648] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1524.174822] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1524.175504] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4f3705ce-a5c5-4f80-b6dd-c74804cdc700 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.182117] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 1524.182117] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52df8eda-fe44-2446-f1e0-c15b6dc669d6" [ 1524.182117] env[60764]: _type = "Task" [ 1524.182117] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1524.189063] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52df8eda-fe44-2446-f1e0-c15b6dc669d6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1524.238026] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1524.238299] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1524.238456] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Deleting the datastore file [datastore2] ce8f8161-623c-4f88-8846-8f3b5a4ceabe {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1524.238722] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-34eab2dd-6ded-4067-af7b-416d60693fc0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.245158] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Waiting for the task: (returnval){ [ 1524.245158] env[60764]: value = "task-2205000" [ 1524.245158] env[60764]: _type = "Task" [ 1524.245158] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1524.252720] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Task: {'id': task-2205000, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1524.692304] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1524.692659] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Creating directory with path [datastore2] vmware_temp/7acb946c-b10c-4728-9972-17ae7c0b519d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1524.692821] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f9c45f8-c7fe-4c91-aeaa-280f989f7d35 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.704425] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Created directory with path [datastore2] vmware_temp/7acb946c-b10c-4728-9972-17ae7c0b519d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1524.704602] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Fetch image to [datastore2] vmware_temp/7acb946c-b10c-4728-9972-17ae7c0b519d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1524.704766] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/7acb946c-b10c-4728-9972-17ae7c0b519d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1524.705470] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc8a788b-7ed4-42b9-a616-445fa54a5f4a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.711735] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f23b82-1cee-430f-84d8-8c99b87cadb1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1524.720274] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5714e3ba-ac81-4e63-adcd-6d80890e60bb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.753700] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c2842b6-8140-41ca-8d93-96bb0a046668 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.761534] env[60764]: DEBUG oslo_vmware.api [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Task: {'id': task-2205000, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080636} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1524.762000] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1524.762194] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1524.762361] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1524.762527] env[60764]: INFO nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Took 0.60 seconds to destroy the instance on the hypervisor. 
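The "Waiting for the task ... progress is 0% ... completed successfully" pattern that recurs above (CopyVirtualDisk_Task task-2204998, DeleteDatastoreFile_Task task-2205000, the session-scoped SearchDatastore_Task) is oslo.vmware polling the vCenter Task managed object until it leaves the queued/running states; the wait_for_task and _poll_task frames in the tracebacks are exactly this loop. A minimal sketch of the idea follows, assuming a hypothetical get_task_info() callable and a local TaskError class in place of the real suds/property-collector plumbing in oslo_vmware/api.py:

import time

class TaskError(Exception):
    """Stand-in for the translated VIM fault raised when a task ends in
    error, e.g. the InvalidArgument/fileType fault seen above."""

def wait_for_vcenter_task(get_task_info, poll_interval=0.5):
    """Poll a vCenter task until it reaches a terminal state.

    get_task_info is a hypothetical callable returning an object with
    .state ('queued', 'running', 'success' or 'error'), .progress and
    .error attributes, as read from the Task managed object.
    """
    while True:
        info = get_task_info()
        if info.state in ('queued', 'running'):
            # Each pass here corresponds to a "progress is N%" DEBUG line.
            time.sleep(poll_interval)
            continue
        if info.state == 'success':
            # Logged as "completed successfully" with duration_secs.
            return info
        # Error state: the fault is translated into a Python exception,
        # which is how VimFaultException reaches the compute manager.
        raise TaskError(info.error)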
[ 1524.764034] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aac70dda-df4a-4483-8b36-12767e2cefaf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.765837] env[60764]: DEBUG nova.compute.claims [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1524.766014] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1524.766233] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1524.787153] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1524.843375] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1524.904869] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1524.906169] env[60764]: ERROR nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. 
[ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = getattr(controller, method)(*args, **kwargs) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._get(image_id) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] resp, body = self.http_client.get(url, headers=header) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.request(url, 'GET', **kwargs) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._handle_response(resp) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exc.from_response(resp, resp.content) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] yield resources [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self.driver.spawn(context, instance, image_meta, [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._fetch_image_if_missing(context, vi) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image_fetch(context, vi, tmp_image_ds_loc) [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] images.fetch_image( [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1524.906169] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] metadata = IMAGE_API.get(context, image_ref) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return session.show(context, image_id, [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] _reraise_translated_image_exception(image_id) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise new_exc.with_traceback(exc_trace) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = getattr(controller, method)(*args, **kwargs) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._get(image_id) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] resp, body = self.http_client.get(url, headers=header) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.request(url, 'GET', **kwargs) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._handle_response(resp) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exc.from_response(resp, resp.content) [ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] nova.exception.ImageNotAuthorized: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. 
[ 1524.907315] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1524.907315] env[60764]: INFO nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Terminating instance [ 1524.908374] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1524.908374] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1524.908864] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1524.909067] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1524.911274] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96c05f76-f9ed-414e-b05c-16c9df5730e9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.914076] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03bbf93e-392b-4e0d-8fb0-e355cd6545aa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.921414] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1524.921671] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7d675238-22ea-49d8-bb4b-c9b4bfa286c2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.923844] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1524.924020] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None 
req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1524.924973] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-230912ba-f8b0-4da1-bbeb-5a7ccfd12132 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1524.931866] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Waiting for the task: (returnval){ [ 1524.931866] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5227f01d-bb98-2139-9341-fa7b0ea02b6a" [ 1524.931866] env[60764]: _type = "Task" [ 1524.931866] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1524.938748] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5227f01d-bb98-2139-9341-fa7b0ea02b6a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1524.995284] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1524.995498] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1524.995673] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Deleting the datastore file [datastore2] aad42e7f-24c2-400e-8a1c-6baae2081e29 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1524.995938] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-374ab4ad-93fd-4aad-97d0-84bf5b6ff069 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.004820] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for the task: (returnval){ [ 1525.004820] env[60764]: value = "task-2205002" [ 1525.004820] env[60764]: _type = "Task" [ 1525.004820] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1525.013927] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': task-2205002, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1525.043116] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-329b3e7a-419a-47dc-b081-ddefdecc3eee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.050307] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f8b7051-bf49-4685-971f-773d3a75d465 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.081646] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a658eed8-7e0e-4593-9a74-cd65d6162a4a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.088440] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c968c3cd-19e8-4cd5-9692-c70a381fdad5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.100945] env[60764]: DEBUG nova.compute.provider_tree [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1525.110051] env[60764]: DEBUG nova.scheduler.client.report [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1525.126137] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.360s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.126663] env[60764]: ERROR nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified 
parameter was not correct: fileType [ 1525.126663] env[60764]: Faults: ['InvalidArgument'] [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Traceback (most recent call last): [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self.driver.spawn(context, instance, image_meta, [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self._fetch_image_if_missing(context, vi) [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] image_cache(vi, tmp_image_ds_loc) [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] vm_util.copy_virtual_disk( [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] session._wait_for_task(vmdk_copy_task) [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] return self.wait_for_task(task_ref) [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] return evt.wait() [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] result = hub.switch() [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] return self.greenlet.switch() [ 1525.126663] env[60764]: 
ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] self.f(*self.args, **self.kw) [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] raise exceptions.translate_fault(task_info.error) [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Faults: ['InvalidArgument'] [ 1525.126663] env[60764]: ERROR nova.compute.manager [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] [ 1525.127686] env[60764]: DEBUG nova.compute.utils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1525.128707] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Build of instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe was re-scheduled: A specified parameter was not correct: fileType [ 1525.128707] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1525.129096] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1525.129273] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1525.129448] env[60764]: DEBUG nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1525.129606] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1525.329800] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1525.442810] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1525.443081] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Creating directory with path [datastore2] vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1525.443355] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a00303e1-62bd-4a53-8217-a513667af802 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.454772] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Created directory with path [datastore2] vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1525.454964] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Fetch image to [datastore2] vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1525.455151] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] 
vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1525.455982] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb36f468-a4ac-4297-8889-5130c63e5bef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.462866] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddc4afeb-a074-4f8d-bed5-db9786915991 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.471776] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf5339e3-53c2-48b9-af85-8cd4ba81c765 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.504221] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70865b2b-276e-49b7-ab29-57a8249a517b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.517713] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b9518611-bee6-4950-880b-3bdead446c43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.519654] env[60764]: DEBUG oslo_vmware.api [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Task: {'id': task-2205002, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063847} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1525.519986] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1525.520224] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1525.520433] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1525.520650] env[60764]: INFO nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Took 0.61 seconds to destroy the instance on the hypervisor. 
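
Note: the DeleteDatastoreFile_Task entry just above ("completed successfully") and the earlier fileType fault ("A specified parameter was not correct: fileType") both come out of the same oslo.vmware task-polling loop. The following is a minimal, self-contained Python sketch of that pattern only; TaskInfo, VimFaultException and poll_task here are simplified stand-ins and not the real oslo.vmware classes or signatures.

    # Illustrative sketch (not Nova/oslo.vmware source): poll a task's state and
    # either return on success or raise a translated fault, mirroring what the
    # "_poll_task" lines in this log report.
    import time
    from dataclasses import dataclass, field


    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list


    @dataclass
    class TaskInfo:
        state: str                      # 'running' | 'success' | 'error'
        error_message: str = ''
        faults: list = field(default_factory=list)


    def poll_task(fetch_task_info, interval=0.5, timeout=60):
        """Poll until the task finishes; raise a translated fault on error."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = fetch_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # On the error path the translated exception propagates up to
                # the compute manager, which re-schedules the build -- the
                # "was re-scheduled: A specified parameter was not correct:
                # fileType" entry above.
                raise VimFaultException(info.faults, info.error_message)
            time.sleep(interval)
        raise TimeoutError('task did not complete in time')
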
[ 1525.522875] env[60764]: DEBUG nova.compute.claims [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1525.523146] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1525.523429] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.538478] env[60764]: DEBUG nova.network.neutron [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1525.542164] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1525.556479] env[60764]: INFO nova.compute.manager [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Took 0.43 seconds to deallocate network for instance. [ 1525.606932] env[60764]: DEBUG oslo_vmware.rw_handles [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1525.675698] env[60764]: DEBUG oslo_vmware.rw_handles [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Completed reading data from the image iterator. 
{{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1525.675951] env[60764]: DEBUG oslo_vmware.rw_handles [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1525.695008] env[60764]: INFO nova.scheduler.client.report [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Deleted allocations for instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe [ 1525.721703] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b491df87-9a8b-4fcb-beed-03ecb65bf045 tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 679.789s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.722486] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 483.019s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.722736] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Acquiring lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1525.722938] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.723143] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.727288] env[60764]: INFO nova.compute.manager [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 
tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Terminating instance [ 1525.729097] env[60764]: DEBUG nova.compute.manager [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1525.729299] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1525.729764] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-67fc238a-9108-4e06-ae89-62183e5f2df3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.733499] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1525.743025] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66432e15-c0b5-4933-83b7-37467700abb3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.777579] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce8f8161-623c-4f88-8846-8f3b5a4ceabe could not be found. [ 1525.777808] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1525.778010] env[60764]: INFO nova.compute.manager [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1525.778294] env[60764]: DEBUG oslo.service.loopingcall [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1525.783052] env[60764]: DEBUG nova.compute.manager [-] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1525.783165] env[60764]: DEBUG nova.network.neutron [-] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1525.794379] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1525.809124] env[60764]: DEBUG nova.network.neutron [-] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1525.817613] env[60764]: INFO nova.compute.manager [-] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] Took 0.03 seconds to deallocate network for instance. [ 1525.878656] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14ca8b76-f9ad-4234-a1f5-e8a0e0a31b5e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.886101] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e83a3af7-eeed-4ed0-a33a-174830d15650 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.919180] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f940de6c-ce2b-409d-aa3d-0ea98be6bce1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.924332] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e44b8ea1-9a96-4520-aabf-d8f4f797e43b tempest-ImagesOneServerNegativeTestJSON-784605606 tempest-ImagesOneServerNegativeTestJSON-784605606-project-member] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.202s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.925592] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 429.548s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.925785] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: ce8f8161-623c-4f88-8846-8f3b5a4ceabe] During sync_power_state the instance has a pending task (deleting). Skip. 
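
Note: the recurring "Acquiring lock ... by ...", "acquired ... waited Ns" and '"released" ... held Ns' entries are oslo_concurrency.lockutils bookkeeping around named locks such as "compute_resources" and the per-instance UUID locks. A minimal pure-Python mimic of that accounting (not the oslo.concurrency implementation) looks like this:

    # Illustrative sketch: a context manager that reproduces the waited/held
    # timing that lockutils logs around each named lock.
    import threading
    import time
    from contextlib import contextmanager

    _locks = {}
    _registry_guard = threading.Lock()


    @contextmanager
    def timed_lock(name, owner):
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired by "{owner}" :: waited {waited:.3f}s')
        held_from = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            held = time.monotonic() - held_from
            print(f'Lock "{name}" "released" by "{owner}" :: held {held:.3f}s')


    # Usage, mirroring the resource-tracker entries in this log:
    # with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    #     ...claim or abort resources...
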
[ 1525.925953] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "ce8f8161-623c-4f88-8846-8f3b5a4ceabe" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.930427] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4528c981-2e43-4af8-b29b-ae6cccc88443 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.945545] env[60764]: DEBUG nova.compute.provider_tree [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1525.955409] env[60764]: DEBUG nova.scheduler.client.report [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1525.971189] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.448s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.972080] env[60764]: ERROR nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. 
[ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = getattr(controller, method)(*args, **kwargs) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._get(image_id) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] resp, body = self.http_client.get(url, headers=header) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.request(url, 'GET', **kwargs) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._handle_response(resp) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exc.from_response(resp, resp.content) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self.driver.spawn(context, instance, image_meta, [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._fetch_image_if_missing(context, vi) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image_fetch(context, vi, tmp_image_ds_loc) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] images.fetch_image( [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] metadata = IMAGE_API.get(context, image_ref) [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1525.972080] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return session.show(context, image_id, [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] _reraise_translated_image_exception(image_id) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise new_exc.with_traceback(exc_trace) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: 
aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = getattr(controller, method)(*args, **kwargs) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._get(image_id) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] resp, body = self.http_client.get(url, headers=header) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.request(url, 'GET', **kwargs) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._handle_response(resp) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exc.from_response(resp, resp.content) [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] nova.exception.ImageNotAuthorized: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. [ 1525.973305] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1525.973305] env[60764]: DEBUG nova.compute.utils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. 
{{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1525.974350] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.180s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.975318] env[60764]: INFO nova.compute.claims [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1525.978208] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Build of instance aad42e7f-24c2-400e-8a1c-6baae2081e29 was re-scheduled: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1525.978672] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1525.978842] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1525.979062] env[60764]: DEBUG nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1525.979169] env[60764]: DEBUG nova.network.neutron [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1526.091060] env[60764]: DEBUG neutronclient.v2_0.client [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60764) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1526.092386] env[60764]: ERROR nova.compute.manager [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = getattr(controller, method)(*args, **kwargs) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._get(image_id) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] resp, body = self.http_client.get(url, headers=header) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: 
aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.request(url, 'GET', **kwargs) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._handle_response(resp) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exc.from_response(resp, resp.content) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self.driver.spawn(context, instance, image_meta, [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._fetch_image_if_missing(context, vi) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image_fetch(context, vi, tmp_image_ds_loc) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] 
images.fetch_image( [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] metadata = IMAGE_API.get(context, image_ref) [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1526.092386] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return session.show(context, image_id, [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] _reraise_translated_image_exception(image_id) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise new_exc.with_traceback(exc_trace) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = getattr(controller, method)(*args, **kwargs) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._get(image_id) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] resp, body = self.http_client.get(url, headers=header) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.request(url, 'GET', **kwargs) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self._handle_response(resp) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exc.from_response(resp, resp.content) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] nova.exception.ImageNotAuthorized: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._build_and_run_instance(context, instance, image, [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exception.RescheduledException( [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] nova.exception.RescheduledException: Build of instance aad42e7f-24c2-400e-8a1c-6baae2081e29 was re-scheduled: Not authorized for image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d. 
[ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] exception_handler_v20(status_code, error_body) [ 1526.093506] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise client_exc(message=error_message, [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Neutron server returns request_ids: ['req-0b10170f-f1a4-46f0-baad-523aeb5f19a8'] [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._deallocate_network(context, instance, requested_networks) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self.network_api.deallocate_for_instance( [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] data = neutron.list_ports(**search_opts) [ 
1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.list('ports', self.ports_path, retrieve_all, [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] for r in self._pagination(collection, path, **params): [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] res = self.get(path, params=params) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.retry_request("GET", action, body=body, [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.do_request(method, action, body=body, [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._handle_fault_response(status_code, replybody, resp) [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exception.Unauthorized() [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] nova.exception.Unauthorized: Not authorized. [ 1526.095324] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1526.142011] env[60764]: INFO nova.scheduler.client.report [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Deleted allocations for instance aad42e7f-24c2-400e-8a1c-6baae2081e29 [ 1526.162127] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d4b9be43-fa4e-4606-8f18-c14ef2f3c795 tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 639.686s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1526.163235] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 442.795s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1526.163447] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "aad42e7f-24c2-400e-8a1c-6baae2081e29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1526.163648] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1526.163832] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1526.165727] env[60764]: INFO nova.compute.manager [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 
tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Terminating instance [ 1526.167494] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquiring lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1526.167494] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Acquired lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1526.167697] env[60764]: DEBUG nova.network.neutron [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1526.177027] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1526.223379] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1526.249288] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93467f9a-4da4-4428-90c0-13cc1671a086 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.256744] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c86ee65-4e77-4f91-a0cb-bd593edebbeb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.290551] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7d08e8b-3912-48e7-81dc-7aa6853ef8df {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.298101] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1bc158-e43e-4b50-ad13-f3bf4dd4d111 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.311108] env[60764]: DEBUG nova.compute.provider_tree [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1526.320548] 
env[60764]: DEBUG nova.scheduler.client.report [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1526.333557] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.360s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1526.334060] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1526.336657] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.113s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1526.338081] env[60764]: INFO nova.compute.claims [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1526.371988] env[60764]: DEBUG nova.compute.utils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1526.373378] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Allocating IP information in the background. 
{{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1526.373542] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1526.386998] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1526.460265] env[60764]: DEBUG nova.policy [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68b8120ac73640ac85230e4b45c29685', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c8af41aebb6a44da83a21d11b6fd7987', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1526.486796] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1526.520662] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1526.520985] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1526.521227] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1526.521554] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1526.521758] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1526.522059] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1526.522379] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1526.522628] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1526.522941] env[60764]: DEBUG nova.virt.hardware [None 
req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1526.523151] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1526.523417] env[60764]: DEBUG nova.virt.hardware [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1526.524712] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-936bcc9f-9ef3-4955-875f-3b9add415d26 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.535975] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1976684-e8a5-4fc0-9296-2e8eadf977a6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.638707] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d100a2-3756-4713-92bf-d33ca4c7ad88 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.648398] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce718eec-b7c3-4e46-942d-ae274bc6e391 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.681950] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72a5a348-6062-483a-960a-81da3724db30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.692051] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0d1abaf-247b-4e5f-afb1-092248749b6a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.708633] env[60764]: DEBUG nova.compute.provider_tree [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1526.720371] env[60764]: DEBUG nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1526.735122] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1526.735613] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1526.759062] env[60764]: DEBUG nova.network.neutron [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Updating instance_info_cache with network_info: [{"id": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "address": "fa:16:3e:40:19:68", "network": {"id": "8b1f536d-33f3-4467-a71d-74e0c1b9982f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "c84a3b13dad24426842b23ff07092e6c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f87a752-ebb0-49a4-a67b-e356fa45b89b", "external-id": "nsx-vlan-transportzone-889", "segmentation_id": 889, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap71dfa984-60", "ovs_interfaceid": "71dfa984-60dd-4c44-b1d4-9639276d87e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1526.771573] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Releasing lock "refresh_cache-aad42e7f-24c2-400e-8a1c-6baae2081e29" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1526.772516] env[60764]: DEBUG nova.compute.manager [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1526.772750] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1526.773706] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d5f952ef-a267-4e65-ab83-ed8a6e7aa2e4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.777702] env[60764]: DEBUG nova.compute.utils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1526.779795] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1526.779795] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1526.785925] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-958532e9-fb65-457f-b301-98d17d4a4433 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.797739] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1526.817624] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aad42e7f-24c2-400e-8a1c-6baae2081e29 could not be found. [ 1526.817840] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1526.818262] env[60764]: INFO nova.compute.manager [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1526.818262] env[60764]: DEBUG oslo.service.loopingcall [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1526.818482] env[60764]: DEBUG nova.compute.manager [-] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1526.818577] env[60764]: DEBUG nova.network.neutron [-] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1526.844608] env[60764]: DEBUG nova.policy [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1154fa431dad4ae1ae467fc3ea6206b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c4f5a1b557e4c31b54b7f87223a20d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1526.868925] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1526.946035] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Successfully created port: 0af289d4-9dd1-4477-ab2c-23550dc3eaf5 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1526.958574] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1526.958817] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1526.959570] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1526.959570] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1526.959570] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1526.959570] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1526.959751] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 1526.959851] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1526.959986] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1526.961632] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1526.961740] env[60764]: DEBUG nova.virt.hardware [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1526.965080] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82880c23-2562-4a00-8021-a61968015049 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.970977] env[60764]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60764) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1526.971214] env[60764]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1526.972090] env[60764]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-0f6835b7-7c3c-4fdf-aa8c-b46ca3a55d54'] [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1526.972090] env[60764]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1526.972090] env[60764]: ERROR oslo.service.loopingcall [ 1526.974277] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6315d8a-497e-400a-8be9-1d1c02027979 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.977846] env[60764]: ERROR nova.compute.manager [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1527.014297] env[60764]: ERROR nova.compute.manager [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] exception_handler_v20(status_code, error_body) [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise client_exc(message=error_message, [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Neutron server returns request_ids: ['req-0f6835b7-7c3c-4fdf-aa8c-b46ca3a55d54'] [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During handling of the above exception, another exception occurred: [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Traceback (most recent call last): [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._delete_instance(context, instance, bdms) [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._shutdown_instance(context, instance, bdms) [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] 
self._try_deallocate_network(context, instance, requested_networks) [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] with excutils.save_and_reraise_exception(): [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self.force_reraise() [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise self.value [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] _deallocate_network_with_retries() [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return evt.wait() [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = hub.switch() [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.greenlet.switch() [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = func(*self.args, **self.kw) [ 1527.014297] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] result = f(*args, **kwargs) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._deallocate_network( [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/compute/manager.py", line 
2265, in _deallocate_network [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self.network_api.deallocate_for_instance( [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] data = neutron.list_ports(**search_opts) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.list('ports', self.ports_path, retrieve_all, [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] for r in self._pagination(collection, path, **params): [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] res = self.get(path, params=params) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.retry_request("GET", action, body=body, [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] return self.do_request(method, action, body=body, [ 
1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] ret = obj(*args, **kwargs) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] self._handle_fault_response(status_code, replybody, resp) [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1527.016182] env[60764]: ERROR nova.compute.manager [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] [ 1527.053039] env[60764]: DEBUG oslo_concurrency.lockutils [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.889s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1527.053547] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 430.676s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1527.053744] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] During sync_power_state the instance has a pending task (deleting). Skip. [ 1527.053945] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "aad42e7f-24c2-400e-8a1c-6baae2081e29" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1527.115573] env[60764]: INFO nova.compute.manager [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] [instance: aad42e7f-24c2-400e-8a1c-6baae2081e29] Successfully reverted task state from None on failure for instance. [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server [None req-e50b59cf-5d54-44a6-ae72-842082d9740f tempest-DeleteServersAdminTestJSON-1309106710 tempest-DeleteServersAdminTestJSON-1309106710-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-0f6835b7-7c3c-4fdf-aa8c-b46ca3a55d54'] [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1527.119770] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server raise self.value [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.121324] env[60764]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1527.121324] env[60764]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1527.122857] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1527.122857] env[60764]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1527.122857] env[60764]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1527.122857] env[60764]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1527.122857] env[60764]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1527.122857] env[60764]: ERROR oslo_messaging.rpc.server [ 1527.203798] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Successfully created port: 29670445-02d3-497a-a37d-349f01f0c443 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1527.329645] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1527.862058] env[60764]: DEBUG nova.compute.manager [req-9d87b06a-1cee-46d6-a9e5-0844e5951ca1 req-732cd620-9e94-4bdb-b73d-1f9df48020b6 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Received event network-vif-plugged-0af289d4-9dd1-4477-ab2c-23550dc3eaf5 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1527.862290] env[60764]: DEBUG oslo_concurrency.lockutils [req-9d87b06a-1cee-46d6-a9e5-0844e5951ca1 req-732cd620-9e94-4bdb-b73d-1f9df48020b6 service nova] Acquiring lock "a83f4609-4c09-4056-a840-cd899af93ea3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1527.862855] env[60764]: DEBUG oslo_concurrency.lockutils [req-9d87b06a-1cee-46d6-a9e5-0844e5951ca1 req-732cd620-9e94-4bdb-b73d-1f9df48020b6 service nova] Lock "a83f4609-4c09-4056-a840-cd899af93ea3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1527.863119] env[60764]: DEBUG oslo_concurrency.lockutils [req-9d87b06a-1cee-46d6-a9e5-0844e5951ca1 req-732cd620-9e94-4bdb-b73d-1f9df48020b6 service nova] Lock "a83f4609-4c09-4056-a840-cd899af93ea3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1527.863301] env[60764]: DEBUG nova.compute.manager [req-9d87b06a-1cee-46d6-a9e5-0844e5951ca1 req-732cd620-9e94-4bdb-b73d-1f9df48020b6 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] No waiting events found dispatching network-vif-plugged-0af289d4-9dd1-4477-ab2c-23550dc3eaf5 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1527.863468] env[60764]: WARNING nova.compute.manager [req-9d87b06a-1cee-46d6-a9e5-0844e5951ca1 req-732cd620-9e94-4bdb-b73d-1f9df48020b6 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Received unexpected event network-vif-plugged-0af289d4-9dd1-4477-ab2c-23550dc3eaf5 for instance with vm_state building and task_state spawning. 
[ 1527.864645] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Successfully updated port: 0af289d4-9dd1-4477-ab2c-23550dc3eaf5 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1527.882502] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "refresh_cache-a83f4609-4c09-4056-a840-cd899af93ea3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1527.882654] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquired lock "refresh_cache-a83f4609-4c09-4056-a840-cd899af93ea3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1527.883654] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1527.931619] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1527.956395] env[60764]: DEBUG nova.compute.manager [req-f34e8f60-b7d5-4b6a-8d60-d29efdda0646 req-f7b20011-7895-48d7-8fb6-85809cb8a409 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Received event network-vif-plugged-29670445-02d3-497a-a37d-349f01f0c443 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1527.956726] env[60764]: DEBUG oslo_concurrency.lockutils [req-f34e8f60-b7d5-4b6a-8d60-d29efdda0646 req-f7b20011-7895-48d7-8fb6-85809cb8a409 service nova] Acquiring lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1527.957036] env[60764]: DEBUG oslo_concurrency.lockutils [req-f34e8f60-b7d5-4b6a-8d60-d29efdda0646 req-f7b20011-7895-48d7-8fb6-85809cb8a409 service nova] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1527.957729] env[60764]: DEBUG oslo_concurrency.lockutils [req-f34e8f60-b7d5-4b6a-8d60-d29efdda0646 req-f7b20011-7895-48d7-8fb6-85809cb8a409 service nova] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1527.958365] env[60764]: DEBUG nova.compute.manager [req-f34e8f60-b7d5-4b6a-8d60-d29efdda0646 
req-f7b20011-7895-48d7-8fb6-85809cb8a409 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] No waiting events found dispatching network-vif-plugged-29670445-02d3-497a-a37d-349f01f0c443 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1527.958554] env[60764]: WARNING nova.compute.manager [req-f34e8f60-b7d5-4b6a-8d60-d29efdda0646 req-f7b20011-7895-48d7-8fb6-85809cb8a409 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Received unexpected event network-vif-plugged-29670445-02d3-497a-a37d-349f01f0c443 for instance with vm_state building and task_state spawning. [ 1527.993242] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Successfully updated port: 29670445-02d3-497a-a37d-349f01f0c443 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1528.020121] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "refresh_cache-e96a6b8e-75b7-4a2f-a838-107603ad8b80" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1528.020295] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "refresh_cache-e96a6b8e-75b7-4a2f-a838-107603ad8b80" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1528.020451] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1528.087786] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1528.208771] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Updating instance_info_cache with network_info: [{"id": "0af289d4-9dd1-4477-ab2c-23550dc3eaf5", "address": "fa:16:3e:34:16:84", "network": {"id": "cddf5377-78f7-4685-9d74-0ac673da750b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-376784710-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c8af41aebb6a44da83a21d11b6fd7987", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0af289d4-9d", "ovs_interfaceid": "0af289d4-9dd1-4477-ab2c-23550dc3eaf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1528.219146] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Releasing lock "refresh_cache-a83f4609-4c09-4056-a840-cd899af93ea3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1528.219430] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance network_info: |[{"id": "0af289d4-9dd1-4477-ab2c-23550dc3eaf5", "address": "fa:16:3e:34:16:84", "network": {"id": "cddf5377-78f7-4685-9d74-0ac673da750b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-376784710-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c8af41aebb6a44da83a21d11b6fd7987", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0af289d4-9d", "ovs_interfaceid": "0af289d4-9dd1-4477-ab2c-23550dc3eaf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1528.219816] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:16:84', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0f096917-a0cf-4add-a9d2-23ca1c723b3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0af289d4-9dd1-4477-ab2c-23550dc3eaf5', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1528.227310] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Creating folder: Project (c8af41aebb6a44da83a21d11b6fd7987). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1528.227847] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d26763f3-1e48-403a-9781-01a61466164e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.238294] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Created folder: Project (c8af41aebb6a44da83a21d11b6fd7987) in parent group-v449629. [ 1528.238482] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Creating folder: Instances. Parent ref: group-v449725. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1528.238703] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be220ca8-061a-46dd-bdc2-bce05bee0178 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.247123] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Created folder: Instances in parent group-v449725. [ 1528.247338] env[60764]: DEBUG oslo.service.loopingcall [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1528.247509] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1528.247695] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1f1be360-6abe-42df-aa8c-83d1db095493 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.267531] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1528.267531] env[60764]: value = "task-2205005" [ 1528.267531] env[60764]: _type = "Task" [ 1528.267531] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1528.275150] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205005, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.275979] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Updating instance_info_cache with network_info: [{"id": "29670445-02d3-497a-a37d-349f01f0c443", "address": "fa:16:3e:fd:0e:9b", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29670445-02", "ovs_interfaceid": "29670445-02d3-497a-a37d-349f01f0c443", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1528.288656] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "refresh_cache-e96a6b8e-75b7-4a2f-a838-107603ad8b80" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1528.288945] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance network_info: |[{"id": "29670445-02d3-497a-a37d-349f01f0c443", "address": "fa:16:3e:fd:0e:9b", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29670445-02", "ovs_interfaceid": 
"29670445-02d3-497a-a37d-349f01f0c443", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1528.289322] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fd:0e:9b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd177c5b3-a5b1-4c78-854e-7e0dbf341ea1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '29670445-02d3-497a-a37d-349f01f0c443', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1528.296832] env[60764]: DEBUG oslo.service.loopingcall [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1528.297278] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1528.297494] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1f45cd39-f661-422d-a553-df4207875fac {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.316517] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1528.316517] env[60764]: value = "task-2205006" [ 1528.316517] env[60764]: _type = "Task" [ 1528.316517] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1528.324190] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205006, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.777883] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205005, 'name': CreateVM_Task, 'duration_secs': 0.330133} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1528.778062] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1528.778755] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1528.778915] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1528.779246] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1528.779515] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90b9227a-b0ff-4d2d-94b2-a05b5908d1f9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.784141] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Waiting for the task: (returnval){ [ 1528.784141] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]527f957b-c764-c53e-8704-675066b0386b" [ 1528.784141] env[60764]: _type = "Task" [ 1528.784141] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1528.792032] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]527f957b-c764-c53e-8704-675066b0386b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.825820] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205006, 'name': CreateVM_Task, 'duration_secs': 0.319218} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1528.825999] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1528.826660] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1529.294534] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1529.294865] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1529.295024] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1529.295230] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1529.295534] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1529.295785] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b5df017d-2345-4fb5-ac0c-62491e0442fe {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1529.300382] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 1529.300382] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52007618-b37f-af92-0b78-d4b47b092517" [ 1529.300382] env[60764]: _type = "Task" [ 1529.300382] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1529.310574] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52007618-b37f-af92-0b78-d4b47b092517, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1529.325229] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1529.811250] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1529.811524] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1529.811746] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1529.890616] env[60764]: DEBUG nova.compute.manager [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Received event network-changed-0af289d4-9dd1-4477-ab2c-23550dc3eaf5 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1529.890829] env[60764]: DEBUG nova.compute.manager [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Refreshing instance network info cache due to event network-changed-0af289d4-9dd1-4477-ab2c-23550dc3eaf5. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1529.891057] env[60764]: DEBUG oslo_concurrency.lockutils [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] Acquiring lock "refresh_cache-a83f4609-4c09-4056-a840-cd899af93ea3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1529.891201] env[60764]: DEBUG oslo_concurrency.lockutils [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] Acquired lock "refresh_cache-a83f4609-4c09-4056-a840-cd899af93ea3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1529.891358] env[60764]: DEBUG nova.network.neutron [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Refreshing network info cache for port 0af289d4-9dd1-4477-ab2c-23550dc3eaf5 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1529.979671] env[60764]: DEBUG nova.compute.manager [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Received event network-changed-29670445-02d3-497a-a37d-349f01f0c443 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1529.979866] env[60764]: DEBUG nova.compute.manager [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Refreshing instance network info cache due to event network-changed-29670445-02d3-497a-a37d-349f01f0c443. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1529.980716] env[60764]: DEBUG oslo_concurrency.lockutils [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] Acquiring lock "refresh_cache-e96a6b8e-75b7-4a2f-a838-107603ad8b80" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1529.980934] env[60764]: DEBUG oslo_concurrency.lockutils [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] Acquired lock "refresh_cache-e96a6b8e-75b7-4a2f-a838-107603ad8b80" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1529.981131] env[60764]: DEBUG nova.network.neutron [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Refreshing network info cache for port 29670445-02d3-497a-a37d-349f01f0c443 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1530.221858] env[60764]: DEBUG nova.network.neutron [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Updated VIF entry in instance network info cache for port 0af289d4-9dd1-4477-ab2c-23550dc3eaf5. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1530.222234] env[60764]: DEBUG nova.network.neutron [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Updating instance_info_cache with network_info: [{"id": "0af289d4-9dd1-4477-ab2c-23550dc3eaf5", "address": "fa:16:3e:34:16:84", "network": {"id": "cddf5377-78f7-4685-9d74-0ac673da750b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-376784710-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c8af41aebb6a44da83a21d11b6fd7987", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0af289d4-9d", "ovs_interfaceid": "0af289d4-9dd1-4477-ab2c-23550dc3eaf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1530.232363] env[60764]: DEBUG oslo_concurrency.lockutils [req-f20aab55-599c-4853-a49c-0cb315433e9d req-a16f15c5-b76e-4788-84d0-40f61c151aa5 service nova] Releasing lock "refresh_cache-a83f4609-4c09-4056-a840-cd899af93ea3" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1530.298228] env[60764]: DEBUG nova.network.neutron [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Updated VIF entry in instance network info cache for port 29670445-02d3-497a-a37d-349f01f0c443. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1530.298604] env[60764]: DEBUG nova.network.neutron [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Updating instance_info_cache with network_info: [{"id": "29670445-02d3-497a-a37d-349f01f0c443", "address": "fa:16:3e:fd:0e:9b", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29670445-02", "ovs_interfaceid": "29670445-02d3-497a-a37d-349f01f0c443", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1530.308015] env[60764]: DEBUG oslo_concurrency.lockutils [req-ff9a3611-fc7d-4f6b-a6fb-dca59339c2d9 req-3b9e09b0-577f-4322-85ec-14f81216edb8 service nova] Releasing lock "refresh_cache-e96a6b8e-75b7-4a2f-a838-107603ad8b80" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1530.330088] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1530.330247] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1531.330518] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1544.236250] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1544.236647] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1551.903239] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "a83f4609-4c09-4056-a840-cd899af93ea3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1553.362340] env[60764]: DEBUG oslo_concurrency.lockutils [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1570.262702] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "f1940470-82f6-41fb-bd36-96561ad20102" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1570.263043] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f1940470-82f6-41fb-bd36-96561ad20102" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1573.127776] env[60764]: WARNING oslo_vmware.rw_handles [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles 
Traceback (most recent call last): [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1573.127776] env[60764]: ERROR oslo_vmware.rw_handles [ 1573.128553] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1573.130326] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1573.130674] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Copying Virtual Disk [datastore2] vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/4abbfec0-0a0f-44d2-b77f-cf59d5ee8da3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1573.131043] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-960211b0-b0bd-4197-b345-321d05ea73da {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.140607] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Waiting for the task: (returnval){ [ 1573.140607] env[60764]: value = "task-2205007" [ 1573.140607] env[60764]: _type = "Task" [ 1573.140607] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1573.149121] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Task: {'id': task-2205007, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1573.651538] env[60764]: DEBUG oslo_vmware.exceptions [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1573.651792] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1573.652426] env[60764]: ERROR nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1573.652426] env[60764]: Faults: ['InvalidArgument'] [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Traceback (most recent call last): [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] yield resources [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self.driver.spawn(context, instance, image_meta, [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self._fetch_image_if_missing(context, vi) [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] image_cache(vi, tmp_image_ds_loc) [ 1573.652426] env[60764]: ERROR 
nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] vm_util.copy_virtual_disk( [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] session._wait_for_task(vmdk_copy_task) [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] return self.wait_for_task(task_ref) [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] return evt.wait() [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] result = hub.switch() [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] return self.greenlet.switch() [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self.f(*self.args, **self.kw) [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] raise exceptions.translate_fault(task_info.error) [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Faults: ['InvalidArgument'] [ 1573.652426] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] [ 1573.653836] env[60764]: INFO nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Terminating instance [ 1573.654476] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] 
Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1573.654707] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1573.654993] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3a5b9c98-ce2e-44eb-a9aa-6f254da897dd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.657747] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1573.657980] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1573.658768] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-274783c6-c11a-4530-a303-230ca3727458 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.667052] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1573.668439] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cb0e48df-e03f-4852-ac5b-1ac844a116ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.670491] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1573.670838] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1573.671920] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ff312212-f36a-4709-be0d-f677ea666fd5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.677159] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Waiting for the task: (returnval){ [ 1573.677159] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5282b094-da96-0d9d-5212-2ea6b7c9a9c3" [ 1573.677159] env[60764]: _type = "Task" [ 1573.677159] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1573.684136] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5282b094-da96-0d9d-5212-2ea6b7c9a9c3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1573.749204] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1573.749436] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1573.749615] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Deleting the datastore file [datastore2] f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1573.749896] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aaef7bf0-4087-4b1f-9b49-f02f4770543c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.757106] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Waiting for the task: (returnval){ [ 1573.757106] env[60764]: value = "task-2205009" [ 1573.757106] env[60764]: _type = "Task" [ 1573.757106] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1573.765568] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Task: {'id': task-2205009, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1574.187550] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1574.187965] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Creating directory with path [datastore2] vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1574.187965] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f229fd06-1524-426a-b06c-e08337e47e2d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.199135] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Created directory with path [datastore2] vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1574.199372] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Fetch image to [datastore2] vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1574.199590] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1574.200314] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2a65007-c946-4d0c-b0e8-54ff972b470c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.206809] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aceb17c-79bb-4971-a729-e306996f7ac3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.215698] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd740023-0feb-4df4-ace5-f8109571afeb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.246214] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c4ed2e3f-5270-4ab3-8056-a98adba106ef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.251311] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c27a6b70-6e5d-4890-9557-34facdf725b5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.265257] env[60764]: DEBUG oslo_vmware.api [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Task: {'id': task-2205009, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069534} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1574.265481] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1574.265651] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1574.265820] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1574.265985] env[60764]: INFO nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Took 0.61 seconds to destroy the instance on the hypervisor. 
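Editor's note: the records above all follow the same oslo.vmware task pattern: a vSphere *_Task method is invoked through the API session, wait_for_task() suspends the greenthread while the task is polled, and a failed task surfaces as VimFaultException, which is where the "A specified parameter was not correct: fileType" / InvalidArgument fault in the earlier traceback comes from. The following is a minimal sketch of that invoke-and-wait pattern, modelled on the DeleteDatastoreFile_Task call visible above; the vCenter connection details, datastore path, and datacenter reference are placeholders and are not values from this log.

    # Hedged sketch of the invoke/wait pattern seen in the records above.
    # Connection details and the datastore path below are placeholders.
    from oslo_vmware import api, exceptions

    session = api.VMwareAPISession(
        'vcenter.example.org', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] some-instance-uuid',  # placeholder datastore path
        datacenter=None)                         # a Datacenter moref in real use

    try:
        # wait_for_task() polls the task until it completes; a task that ends
        # in error is re-raised via translate_fault(), which is how the
        # VimFaultException ("fileType" / InvalidArgument) above was produced.
        session.wait_for_task(task)
    except exceptions.VimFaultException as exc:
        print('task failed with faults: %s' % exc.fault_list)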
[ 1574.268145] env[60764]: DEBUG nova.compute.claims [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1574.268318] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1574.268533] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1574.275340] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1574.432651] env[60764]: DEBUG oslo_vmware.rw_handles [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1574.494403] env[60764]: DEBUG oslo_vmware.rw_handles [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1574.494643] env[60764]: DEBUG oslo_vmware.rw_handles [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1574.557987] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4ad368b-cdf4-4edc-be3a-864e92d16b76 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.565472] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0a6d025-ca9c-463d-b1b5-84ce92cd4b15 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.594467] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8ccfa0d-89ec-474f-81af-c0557d94922a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.601150] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7132dc03-f7d1-4d61-add5-56582945e249 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.613626] env[60764]: DEBUG nova.compute.provider_tree [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1574.621780] env[60764]: DEBUG nova.scheduler.client.report [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1574.636658] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.367s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1574.636887] env[60764]: ERROR nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1574.636887] env[60764]: Faults: ['InvalidArgument'] [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Traceback (most recent call last): [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self.driver.spawn(context, instance, image_meta, [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self._fetch_image_if_missing(context, vi) [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] image_cache(vi, tmp_image_ds_loc) [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] vm_util.copy_virtual_disk( [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] session._wait_for_task(vmdk_copy_task) [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] return self.wait_for_task(task_ref) [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] return evt.wait() [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] result = hub.switch() [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] return self.greenlet.switch() [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] self.f(*self.args, **self.kw) [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: 
f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] raise exceptions.translate_fault(task_info.error) [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Faults: ['InvalidArgument'] [ 1574.636887] env[60764]: ERROR nova.compute.manager [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] [ 1574.638561] env[60764]: DEBUG nova.compute.utils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1574.639818] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Build of instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de was re-scheduled: A specified parameter was not correct: fileType [ 1574.639818] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1574.640275] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1574.640488] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1574.640696] env[60764]: DEBUG nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1574.640902] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1575.131584] env[60764]: DEBUG nova.network.neutron [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1575.133750] env[60764]: INFO nova.compute.manager [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Took 0.49 seconds to deallocate network for instance. [ 1575.223470] env[60764]: INFO nova.scheduler.client.report [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Deleted allocations for instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de [ 1575.243551] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1d7a925b-b4aa-4bb8-b28a-fbfdd1fcd96b tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 633.168s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.244703] env[60764]: DEBUG oslo_concurrency.lockutils [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 437.335s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.244979] env[60764]: DEBUG oslo_concurrency.lockutils [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Acquiring lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.245190] env[60764]: DEBUG oslo_concurrency.lockutils [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock 
"f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.245385] env[60764]: DEBUG oslo_concurrency.lockutils [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.247522] env[60764]: INFO nova.compute.manager [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Terminating instance [ 1575.249318] env[60764]: DEBUG nova.compute.manager [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1575.249548] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1575.250202] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-03b13698-a920-4626-8004-8224f20910ef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.262028] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f96d72c1-22c1-474c-becd-43f774e0f0f2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.272930] env[60764]: DEBUG nova.compute.manager [None req-a8024a74-db62-4fce-a55d-fe0906c49604 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 7d4b7608-622c-41c8-9532-e216ed41db91] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1575.292863] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de could not be found. 
[ 1575.293086] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1575.293267] env[60764]: INFO nova.compute.manager [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1575.293503] env[60764]: DEBUG oslo.service.loopingcall [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1575.293748] env[60764]: DEBUG nova.compute.manager [-] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1575.293838] env[60764]: DEBUG nova.network.neutron [-] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1575.296710] env[60764]: DEBUG nova.compute.manager [None req-a8024a74-db62-4fce-a55d-fe0906c49604 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: 7d4b7608-622c-41c8-9532-e216ed41db91] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1575.316544] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8024a74-db62-4fce-a55d-fe0906c49604 tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "7d4b7608-622c-41c8-9532-e216ed41db91" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.847s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.318430] env[60764]: DEBUG nova.network.neutron [-] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1575.325376] env[60764]: DEBUG nova.compute.manager [None req-628c6634-eeb2-45a5-b9a9-83dcc0c77749 tempest-ServersNegativeTestMultiTenantJSON-89438538 tempest-ServersNegativeTestMultiTenantJSON-89438538-project-member] [instance: 6af78199-5a15-4ed8-94b3-abc98dbffe37] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1575.328721] env[60764]: INFO nova.compute.manager [-] [instance: f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de] Took 0.03 seconds to deallocate network for instance. [ 1575.346937] env[60764]: DEBUG nova.compute.manager [None req-628c6634-eeb2-45a5-b9a9-83dcc0c77749 tempest-ServersNegativeTestMultiTenantJSON-89438538 tempest-ServersNegativeTestMultiTenantJSON-89438538-project-member] [instance: 6af78199-5a15-4ed8-94b3-abc98dbffe37] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1575.368874] env[60764]: DEBUG oslo_concurrency.lockutils [None req-628c6634-eeb2-45a5-b9a9-83dcc0c77749 tempest-ServersNegativeTestMultiTenantJSON-89438538 tempest-ServersNegativeTestMultiTenantJSON-89438538-project-member] Lock "6af78199-5a15-4ed8-94b3-abc98dbffe37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.872s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.378324] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1575.413103] env[60764]: DEBUG oslo_concurrency.lockutils [None req-30c28746-ca62-4bb1-a984-9725b79d22f4 tempest-ServerRescueTestJSONUnderV235-405328586 tempest-ServerRescueTestJSONUnderV235-405328586-project-member] Lock "f3c23e2a-8d8c-4e85-aca5-0673a8c6d4de" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.168s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.429105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.429355] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.430853] env[60764]: INFO nova.compute.claims [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1575.642398] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-165a0867-0115-4509-8cd1-6edfc8712bae {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.652244] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea909210-0d05-4b47-a1a1-2b82ca64f97f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.681668] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c5bafc5-c28e-4348-b8c1-8687fb57c9d9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.688476] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-71cce848-1b25-4bc4-b2cd-6b6e98422629 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.700903] env[60764]: DEBUG nova.compute.provider_tree [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1575.708784] env[60764]: DEBUG nova.scheduler.client.report [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1575.721118] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.721626] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1575.755988] env[60764]: DEBUG nova.compute.utils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1575.757763] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1575.757763] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1575.769397] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1575.835327] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1575.855450] env[60764]: DEBUG nova.policy [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '01c3cb99b7c348a89d40159c8f9abc7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '21e2c0336c1e449e83e3f3cccb876fe2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1575.863359] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1575.863604] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1575.863762] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1575.863941] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1575.864097] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1575.864248] env[60764]: DEBUG 
nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1575.864457] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1575.864615] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1575.864784] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1575.864948] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1575.865219] env[60764]: DEBUG nova.virt.hardware [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1575.866090] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6df91daf-9471-46f4-8a4b-e4564b871779 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.873550] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26d3c20b-4693-4857-af3d-5ac30302f040 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.250905] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Successfully created port: 0fac313e-bd54-4f68-9475-f4b85ecf20e1 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1576.851762] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Successfully updated port: 0fac313e-bd54-4f68-9475-f4b85ecf20e1 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1576.866814] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "refresh_cache-2ea05216-40c5-4482-a1d8-278f7ea3d28b" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1576.867105] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquired lock "refresh_cache-2ea05216-40c5-4482-a1d8-278f7ea3d28b" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1576.867425] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1576.908352] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1577.114620] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Updating instance_info_cache with network_info: [{"id": "0fac313e-bd54-4f68-9475-f4b85ecf20e1", "address": "fa:16:3e:24:56:7b", "network": {"id": "e4e01888-0f0a-479d-8544-2577e16c9dc3", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-836448257-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "21e2c0336c1e449e83e3f3cccb876fe2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4c6a4836-66dc-4e43-982b-f8fcd3f9989a", "external-id": "nsx-vlan-transportzone-635", "segmentation_id": 635, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fac313e-bd", "ovs_interfaceid": "0fac313e-bd54-4f68-9475-f4b85ecf20e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1577.128842] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Releasing lock "refresh_cache-2ea05216-40c5-4482-a1d8-278f7ea3d28b" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1577.129178] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 
tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance network_info: |[{"id": "0fac313e-bd54-4f68-9475-f4b85ecf20e1", "address": "fa:16:3e:24:56:7b", "network": {"id": "e4e01888-0f0a-479d-8544-2577e16c9dc3", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-836448257-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "21e2c0336c1e449e83e3f3cccb876fe2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4c6a4836-66dc-4e43-982b-f8fcd3f9989a", "external-id": "nsx-vlan-transportzone-635", "segmentation_id": 635, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fac313e-bd", "ovs_interfaceid": "0fac313e-bd54-4f68-9475-f4b85ecf20e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1577.129574] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:56:7b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4c6a4836-66dc-4e43-982b-f8fcd3f9989a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0fac313e-bd54-4f68-9475-f4b85ecf20e1', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1577.137380] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Creating folder: Project (21e2c0336c1e449e83e3f3cccb876fe2). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1577.138051] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-653b3fe4-66db-46f4-b0a2-b55781899203 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.148328] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Created folder: Project (21e2c0336c1e449e83e3f3cccb876fe2) in parent group-v449629. [ 1577.148537] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Creating folder: Instances. Parent ref: group-v449729. 
{{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1577.148773] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d5a2512-5bc4-4ce2-8b02-5627b7d35b07 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.157803] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Created folder: Instances in parent group-v449729. [ 1577.158051] env[60764]: DEBUG oslo.service.loopingcall [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1577.158243] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1577.158439] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a34fee40-adec-4805-b334-32de4b645e12 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.176928] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1577.176928] env[60764]: value = "task-2205012" [ 1577.176928] env[60764]: _type = "Task" [ 1577.176928] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1577.184428] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205012, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1577.254553] env[60764]: DEBUG nova.compute.manager [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Received event network-vif-plugged-0fac313e-bd54-4f68-9475-f4b85ecf20e1 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1577.254879] env[60764]: DEBUG oslo_concurrency.lockutils [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] Acquiring lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1577.255374] env[60764]: DEBUG oslo_concurrency.lockutils [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1577.255610] env[60764]: DEBUG oslo_concurrency.lockutils [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1577.255789] env[60764]: DEBUG nova.compute.manager [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] No waiting events found dispatching network-vif-plugged-0fac313e-bd54-4f68-9475-f4b85ecf20e1 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1577.255957] env[60764]: WARNING nova.compute.manager [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Received unexpected event network-vif-plugged-0fac313e-bd54-4f68-9475-f4b85ecf20e1 for instance with vm_state building and task_state spawning. [ 1577.256130] env[60764]: DEBUG nova.compute.manager [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Received event network-changed-0fac313e-bd54-4f68-9475-f4b85ecf20e1 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1577.256285] env[60764]: DEBUG nova.compute.manager [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Refreshing instance network info cache due to event network-changed-0fac313e-bd54-4f68-9475-f4b85ecf20e1. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1577.256568] env[60764]: DEBUG oslo_concurrency.lockutils [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] Acquiring lock "refresh_cache-2ea05216-40c5-4482-a1d8-278f7ea3d28b" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1577.256631] env[60764]: DEBUG oslo_concurrency.lockutils [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] Acquired lock "refresh_cache-2ea05216-40c5-4482-a1d8-278f7ea3d28b" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1577.256752] env[60764]: DEBUG nova.network.neutron [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Refreshing network info cache for port 0fac313e-bd54-4f68-9475-f4b85ecf20e1 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1577.546509] env[60764]: DEBUG nova.network.neutron [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Updated VIF entry in instance network info cache for port 0fac313e-bd54-4f68-9475-f4b85ecf20e1. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1577.546878] env[60764]: DEBUG nova.network.neutron [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Updating instance_info_cache with network_info: [{"id": "0fac313e-bd54-4f68-9475-f4b85ecf20e1", "address": "fa:16:3e:24:56:7b", "network": {"id": "e4e01888-0f0a-479d-8544-2577e16c9dc3", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-836448257-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "21e2c0336c1e449e83e3f3cccb876fe2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4c6a4836-66dc-4e43-982b-f8fcd3f9989a", "external-id": "nsx-vlan-transportzone-635", "segmentation_id": 635, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fac313e-bd", "ovs_interfaceid": "0fac313e-bd54-4f68-9475-f4b85ecf20e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1577.556246] env[60764]: DEBUG oslo_concurrency.lockutils [req-3a998afd-11d7-47cd-88c0-9c6b5def1750 req-f9825d4d-63b9-4f03-a898-3ce25c1bac3f service nova] Releasing lock "refresh_cache-2ea05216-40c5-4482-a1d8-278f7ea3d28b" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1577.686943] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205012, 'name': CreateVM_Task, 'duration_secs': 0.308801} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
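
The instance_info_cache update a few records above carries the full network_info for port 0fac313e-bd54-4f68-9475-f4b85ecf20e1. Below is a minimal sketch of pulling the commonly needed fields out of that structure; the literal is abridged from the record above (key names match the log), and the helper is illustrative parsing code, not Nova's network model.

network_info = [{
    "id": "0fac313e-bd54-4f68-9475-f4b85ecf20e1",
    "address": "fa:16:3e:24:56:7b",
    "network": {
        "id": "e4e01888-0f0a-479d-8544-2577e16c9dc3",
        "bridge": "br-int",
        "subnets": [{
            "cidr": "192.168.128.0/28",
            "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4},
            "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4}],
            "meta": {"enable_dhcp": True, "dhcp_server": "192.168.128.2"},
        }],
        "meta": {"mtu": 8950, "physical_network": "default"},
    },
    "type": "ovs",
    "devname": "tap0fac313e-bd",
    "active": True,
}]

def summarize_vifs(nw_info):
    # Yield (port_id, mac, devname, fixed_ips) for each VIF entry.
    for vif in nw_info:
        fixed_ips = [ip["address"]
                     for subnet in vif["network"]["subnets"]
                     for ip in subnet["ips"]
                     if ip.get("type") == "fixed"]
        yield vif["id"], vif["address"], vif["devname"], fixed_ips

for port_id, mac, devname, ips in summarize_vifs(network_info):
    print(f"port {port_id} ({mac}) on {devname}: {', '.join(ips)}")
    # -> port 0fac313e-bd54-4f68-9475-f4b85ecf20e1 (fa:16:3e:24:56:7b) on tap0fac313e-bd: 192.168.128.14
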
[ 1577.687136] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1577.687807] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1577.688076] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1577.688428] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1577.688758] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3697899-c866-4842-aac7-6764c19da085 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.693440] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Waiting for the task: (returnval){ [ 1577.693440] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5250efa9-efff-4c5c-78cd-0ac23578c0c8" [ 1577.693440] env[60764]: _type = "Task" [ 1577.693440] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1577.701070] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5250efa9-efff-4c5c-78cd-0ac23578c0c8, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1578.203787] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1578.204087] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1578.204315] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1580.829100] env[60764]: DEBUG oslo_concurrency.lockutils [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1581.330054] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1581.330240] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1581.330349] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1581.352065] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352200] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352278] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352402] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352522] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352641] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352758] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.352936] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.353077] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.353196] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1581.353314] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1581.353770] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1584.329867] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1584.341968] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1584.342207] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1584.342374] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1584.342531] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1584.344018] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a7c27f-e9fc-489b-a562-59f635260df0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.352504] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2113a5fd-21a1-4908-bed7-9262e49845af {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.366577] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c51ead1-996f-4e8b-8a10-1bca0f07ecf3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.372551] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18e4b237-0a5a-441f-ab88-674b9031a55f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.400569] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181258MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1584.400721] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1584.400913] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1584.470081] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b3ca6987-3415-4db5-a514-cd66c342eb7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470257] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470387] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470508] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470629] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470746] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470912] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.470985] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.471097] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.471216] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1584.481617] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e7742bf9-cd57-4a84-853f-886e5bc5a6b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1584.491363] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f13299b8-2c86-41a2-b14c-bfd68ab6dd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1584.500583] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1584.509601] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1584.519718] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1584.519942] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1584.520109] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1584.688456] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85e2f629-d5ad-4172-99be-22e74046b089 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.697208] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-098b086b-7c2d-46fe-8853-f3980c996c09 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.725847] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e3d04b4-6503-4693-88ad-63ec1181b606 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.732473] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35eaae14-26b1-4ad8-9b3a-897f672a06e8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.744880] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1584.753649] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1584.770993] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1584.771218] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1585.771486] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.771486] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.329768] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1590.324744] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1591.326055] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1591.346251] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1591.346409] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1593.331050] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1594.360536] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1594.360817] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1622.084934] env[60764]: WARNING oslo_vmware.rw_handles [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1622.084934] env[60764]: ERROR oslo_vmware.rw_handles [ 1622.085663] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1622.087365] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 
tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1622.087620] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Copying Virtual Disk [datastore2] vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/accc6233-2436-430e-a8ff-cf2f3ab9c56c/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1622.087928] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-27d27149-bae2-4f68-8946-c689a9fd3477 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1622.095432] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Waiting for the task: (returnval){ [ 1622.095432] env[60764]: value = "task-2205013" [ 1622.095432] env[60764]: _type = "Task" [ 1622.095432] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1622.102994] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Task: {'id': task-2205013, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1622.605084] env[60764]: DEBUG oslo_vmware.exceptions [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1622.605368] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1622.605912] env[60764]: ERROR nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1622.605912] env[60764]: Faults: ['InvalidArgument'] [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Traceback (most recent call last): [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] yield resources [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self.driver.spawn(context, instance, image_meta, [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self._fetch_image_if_missing(context, vi) [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] image_cache(vi, tmp_image_ds_loc) [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] vm_util.copy_virtual_disk( [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] session._wait_for_task(vmdk_copy_task) [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] return self.wait_for_task(task_ref) [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] return evt.wait() [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] result = hub.switch() [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] return self.greenlet.switch() [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self.f(*self.args, **self.kw) [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] raise exceptions.translate_fault(task_info.error) [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Faults: ['InvalidArgument'] [ 1622.605912] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] [ 1622.607246] env[60764]: INFO nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Terminating instance [ 1622.607755] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1622.607962] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1622.608209] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1764b1fb-c3f6-448d-9116-88cce32e9c1a 
{{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1622.610300] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1622.610489] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1622.611213] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e1785f3-ef25-4d6d-9167-caf2e346cfa1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1622.617874] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1622.618096] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d539941b-c6d0-493c-8bfa-a6ad7e2f7bb5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1622.620057] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1622.620234] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1622.621157] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ee1d6a6f-6d07-4617-a7a0-f639dff54764 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1622.625748] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Waiting for the task: (returnval){ [ 1622.625748] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52bfecfd-0a51-151f-e496-ec207efdc8cf" [ 1622.625748] env[60764]: _type = "Task" [ 1622.625748] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1622.634716] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52bfecfd-0a51-151f-e496-ec207efdc8cf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1622.964253] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1622.964521] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1622.964702] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Deleting the datastore file [datastore2] b3ca6987-3415-4db5-a514-cd66c342eb7f {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1622.964979] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d32329f6-2123-47bd-aed9-11396bb6b4d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1622.971067] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Waiting for the task: (returnval){ [ 1622.971067] env[60764]: value = "task-2205015" [ 1622.971067] env[60764]: _type = "Task" [ 1622.971067] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1622.978677] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Task: {'id': task-2205015, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1623.136221] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1623.136527] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Creating directory with path [datastore2] vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1623.136770] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c98e9fe2-7ef2-4155-98a4-c56560401ab6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.148084] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Created directory with path [datastore2] vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1623.148275] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Fetch image to [datastore2] vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1623.148445] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1623.149175] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-826b0486-b855-4428-a074-8d702eed42e4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.155420] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a48ebb22-a667-4104-9fcd-6a0ec091e878 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.164061] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3f4e6d8-7c16-44de-ae32-bd43b2fcbd93 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.195748] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b611b208-d6e3-41f3-8356-d7adc86d7026 
{{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.200886] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-80cd1e5f-3757-4f19-a9f2-983d4947a9c3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.220414] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1623.268944] env[60764]: DEBUG oslo_vmware.rw_handles [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1623.328043] env[60764]: DEBUG oslo_vmware.rw_handles [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1623.328254] env[60764]: DEBUG oslo_vmware.rw_handles [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1623.480842] env[60764]: DEBUG oslo_vmware.api [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Task: {'id': task-2205015, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078504} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1623.481152] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1623.481354] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1623.481531] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1623.481703] env[60764]: INFO nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Took 0.87 seconds to destroy the instance on the hypervisor. [ 1623.485590] env[60764]: DEBUG nova.compute.claims [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1623.485762] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1623.485973] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1623.688940] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a83bd2-8cc5-4c87-8897-44cd6815fc54 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.696420] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecf6b507-dd30-4803-9a37-5902702f239c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.725631] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74950932-d77e-4ab4-9692-3a5ff999c0e4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.732202] env[60764]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c511c08-5205-4361-ac3a-7f2c629bafb7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.744554] env[60764]: DEBUG nova.compute.provider_tree [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1623.753614] env[60764]: DEBUG nova.scheduler.client.report [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1623.767296] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1623.767839] env[60764]: ERROR nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1623.767839] env[60764]: Faults: ['InvalidArgument'] [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Traceback (most recent call last): [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self.driver.spawn(context, instance, image_meta, [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self._fetch_image_if_missing(context, vi) [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] image_cache(vi, tmp_image_ds_loc) [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] vm_util.copy_virtual_disk( [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] session._wait_for_task(vmdk_copy_task) [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] return self.wait_for_task(task_ref) [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] return evt.wait() [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] result = hub.switch() [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] return self.greenlet.switch() [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] self.f(*self.args, **self.kw) [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] raise exceptions.translate_fault(task_info.error) [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Faults: ['InvalidArgument'] [ 1623.767839] env[60764]: ERROR nova.compute.manager [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] [ 1623.768695] env[60764]: DEBUG nova.compute.utils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] VimFaultException {{(pid=60764) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1623.769989] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Build of instance b3ca6987-3415-4db5-a514-cd66c342eb7f was re-scheduled: A specified parameter was not correct: fileType [ 1623.769989] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1623.770378] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1623.770588] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1623.770803] env[60764]: DEBUG nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1623.770973] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1624.110573] env[60764]: DEBUG nova.network.neutron [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1624.120810] env[60764]: INFO nova.compute.manager [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Took 0.35 seconds to deallocate network for instance. 
[ 1624.209242] env[60764]: INFO nova.scheduler.client.report [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Deleted allocations for instance b3ca6987-3415-4db5-a514-cd66c342eb7f [ 1624.231287] env[60764]: DEBUG oslo_concurrency.lockutils [None req-1366ee7b-2c33-4a83-b3b5-2097cf54bece tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 637.253s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.232584] env[60764]: DEBUG oslo_concurrency.lockutils [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 441.304s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1624.232919] env[60764]: DEBUG oslo_concurrency.lockutils [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Acquiring lock "b3ca6987-3415-4db5-a514-cd66c342eb7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1624.233070] env[60764]: DEBUG oslo_concurrency.lockutils [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1624.233344] env[60764]: DEBUG oslo_concurrency.lockutils [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.235716] env[60764]: INFO nova.compute.manager [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Terminating instance [ 1624.237193] env[60764]: DEBUG nova.compute.manager [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1624.237387] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1624.237848] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5eff8abc-a9d8-40c4-9878-22e65adada10 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.247834] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c6dda0-8a6c-4e7a-b03e-daa80300d6f3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.259469] env[60764]: DEBUG nova.compute.manager [None req-22d536c8-b277-4c82-8efc-8c24bb0bb856 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: e7742bf9-cd57-4a84-853f-886e5bc5a6b8] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1624.279188] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b3ca6987-3415-4db5-a514-cd66c342eb7f could not be found. [ 1624.279420] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1624.279666] env[60764]: INFO nova.compute.manager [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1624.279870] env[60764]: DEBUG oslo.service.loopingcall [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1624.280148] env[60764]: DEBUG nova.compute.manager [-] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1624.280273] env[60764]: DEBUG nova.network.neutron [-] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1624.285773] env[60764]: DEBUG nova.compute.manager [None req-22d536c8-b277-4c82-8efc-8c24bb0bb856 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: e7742bf9-cd57-4a84-853f-886e5bc5a6b8] Instance disappeared before build. 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1624.302389] env[60764]: DEBUG nova.network.neutron [-] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1624.304993] env[60764]: DEBUG oslo_concurrency.lockutils [None req-22d536c8-b277-4c82-8efc-8c24bb0bb856 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "e7742bf9-cd57-4a84-853f-886e5bc5a6b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.371s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.312031] env[60764]: INFO nova.compute.manager [-] [instance: b3ca6987-3415-4db5-a514-cd66c342eb7f] Took 0.03 seconds to deallocate network for instance. [ 1624.314225] env[60764]: DEBUG nova.compute.manager [None req-61bb0ee9-be2b-4135-b445-5e61e2821973 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] [instance: f13299b8-2c86-41a2-b14c-bfd68ab6dd22] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1624.337899] env[60764]: DEBUG nova.compute.manager [None req-61bb0ee9-be2b-4135-b445-5e61e2821973 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] [instance: f13299b8-2c86-41a2-b14c-bfd68ab6dd22] Instance disappeared before build. {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1624.362707] env[60764]: DEBUG oslo_concurrency.lockutils [None req-61bb0ee9-be2b-4135-b445-5e61e2821973 tempest-AttachVolumeNegativeTest-1914834048 tempest-AttachVolumeNegativeTest-1914834048-project-member] Lock "f13299b8-2c86-41a2-b14c-bfd68ab6dd22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 226.146s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.372027] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1624.423952] env[60764]: DEBUG oslo_concurrency.lockutils [None req-948ef391-7856-4fa4-b494-ffbcfc080c51 tempest-AttachInterfacesTestJSON-1032662537 tempest-AttachInterfacesTestJSON-1032662537-project-member] Lock "b3ca6987-3415-4db5-a514-cd66c342eb7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.191s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.436821] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1624.437111] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1624.438688] env[60764]: INFO nova.compute.claims [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1624.631737] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0861a5ed-a184-4b27-b68c-512422df757c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.639691] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7554d85-2c1e-41f3-980f-2a35712c4435 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.671950] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d4247f8-fe05-4c58-9793-363388be31f6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.679698] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78aa07e2-e6fc-4946-82d7-c069186d33c0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.692558] env[60764]: DEBUG nova.compute.provider_tree [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1624.702172] env[60764]: DEBUG nova.scheduler.client.report [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1624.716968] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1624.717539] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1624.756027] env[60764]: DEBUG nova.compute.utils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1624.757149] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1624.757325] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1624.766026] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1624.816500] env[60764]: DEBUG nova.policy [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8165c7e326c4016a42ba39f68abfce6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5ed1c9589f44a86909b417fac99dab5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1624.825578] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1624.853986] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1624.853986] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1624.853986] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1624.853986] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1624.853986] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1624.853986] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1624.854344] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1624.854407] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1624.854573] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 
tempest-ImagesTestJSON-2052909825-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1624.854734] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1624.854902] env[60764]: DEBUG nova.virt.hardware [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1624.855787] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe2387f3-cd9b-4f4e-8370-3c6fb2e9f398 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.863374] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7b99ded-faf8-4a8e-922f-8c194241c902 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.131245] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Successfully created port: bd88d4da-d855-4449-87ae-ab4d95b03bec {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1625.940589] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Successfully updated port: bd88d4da-d855-4449-87ae-ab4d95b03bec {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1625.953804] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "refresh_cache-a6272c75-92de-45a0-8e3e-82e342f0475c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1625.954075] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "refresh_cache-a6272c75-92de-45a0-8e3e-82e342f0475c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1625.954345] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1625.991567] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1626.152250] env[60764]: DEBUG nova.compute.manager [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Received event network-vif-plugged-bd88d4da-d855-4449-87ae-ab4d95b03bec {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1626.152465] env[60764]: DEBUG oslo_concurrency.lockutils [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] Acquiring lock "a6272c75-92de-45a0-8e3e-82e342f0475c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1626.152688] env[60764]: DEBUG oslo_concurrency.lockutils [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1626.152822] env[60764]: DEBUG oslo_concurrency.lockutils [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.153046] env[60764]: DEBUG nova.compute.manager [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] No waiting events found dispatching network-vif-plugged-bd88d4da-d855-4449-87ae-ab4d95b03bec {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1626.153153] env[60764]: WARNING nova.compute.manager [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Received unexpected event network-vif-plugged-bd88d4da-d855-4449-87ae-ab4d95b03bec for instance with vm_state building and task_state spawning. [ 1626.153305] env[60764]: DEBUG nova.compute.manager [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Received event network-changed-bd88d4da-d855-4449-87ae-ab4d95b03bec {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1626.153462] env[60764]: DEBUG nova.compute.manager [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Refreshing instance network info cache due to event network-changed-bd88d4da-d855-4449-87ae-ab4d95b03bec. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1626.153623] env[60764]: DEBUG oslo_concurrency.lockutils [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] Acquiring lock "refresh_cache-a6272c75-92de-45a0-8e3e-82e342f0475c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1626.223663] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Updating instance_info_cache with network_info: [{"id": "bd88d4da-d855-4449-87ae-ab4d95b03bec", "address": "fa:16:3e:f4:c2:d0", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd88d4da-d8", "ovs_interfaceid": "bd88d4da-d855-4449-87ae-ab4d95b03bec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1626.237546] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "refresh_cache-a6272c75-92de-45a0-8e3e-82e342f0475c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1626.237867] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance network_info: |[{"id": "bd88d4da-d855-4449-87ae-ab4d95b03bec", "address": "fa:16:3e:f4:c2:d0", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd88d4da-d8", 
"ovs_interfaceid": "bd88d4da-d855-4449-87ae-ab4d95b03bec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1626.238191] env[60764]: DEBUG oslo_concurrency.lockutils [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] Acquired lock "refresh_cache-a6272c75-92de-45a0-8e3e-82e342f0475c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1626.238372] env[60764]: DEBUG nova.network.neutron [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Refreshing network info cache for port bd88d4da-d855-4449-87ae-ab4d95b03bec {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1626.239418] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f4:c2:d0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '32463b6d-4569-4755-8a29-873a028690a7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bd88d4da-d855-4449-87ae-ab4d95b03bec', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1626.247452] env[60764]: DEBUG oslo.service.loopingcall [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1626.248073] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1626.248308] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-86b4cb9f-1385-40f3-b9b8-3490c09b309e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1626.271550] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1626.271550] env[60764]: value = "task-2205016" [ 1626.271550] env[60764]: _type = "Task" [ 1626.271550] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1626.279364] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205016, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1626.562825] env[60764]: DEBUG nova.network.neutron [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Updated VIF entry in instance network info cache for port bd88d4da-d855-4449-87ae-ab4d95b03bec. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1626.563206] env[60764]: DEBUG nova.network.neutron [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Updating instance_info_cache with network_info: [{"id": "bd88d4da-d855-4449-87ae-ab4d95b03bec", "address": "fa:16:3e:f4:c2:d0", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd88d4da-d8", "ovs_interfaceid": "bd88d4da-d855-4449-87ae-ab4d95b03bec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1626.573206] env[60764]: DEBUG oslo_concurrency.lockutils [req-075f6cd1-6844-4019-ae2f-3e2f70be9605 req-36245a28-c46a-4fd5-91f7-c6f81a7d7df5 service nova] Releasing lock "refresh_cache-a6272c75-92de-45a0-8e3e-82e342f0475c" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1626.781317] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205016, 'name': CreateVM_Task, 'duration_secs': 0.311299} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1626.781482] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1626.782189] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1626.782365] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1626.782674] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1626.782923] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a033c703-daf5-4e34-9ef2-feb9ba6155b4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1626.786997] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 1626.786997] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52d4058a-2b25-aec3-c3a4-5fe3060f2d8e" [ 1626.786997] env[60764]: _type = "Task" [ 1626.786997] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1626.794132] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52d4058a-2b25-aec3-c3a4-5fe3060f2d8e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1627.297921] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1627.298275] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1627.298345] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1642.330739] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1642.331725] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1642.331725] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1642.354200] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 6397ff19-1385-4e38-b199-666394582582] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.354373] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.354526] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.354665] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.354792] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.354928] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.355063] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.355188] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.355594] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.355594] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1642.355594] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1643.330323] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1644.330975] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1644.343012] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1644.343271] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.343449] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1644.343607] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1644.344766] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-094df71a-6d06-4207-a238-fe923d56b7f2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.353453] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a99b619f-78cc-4bdf-b43b-c064c7e80cd5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.367075] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d4e7302-8b8e-4aac-9838-7fb3369891a9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.372936] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dede325-3d98-4db0-b053-20697ec92d52 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.402267] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181229MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1644.402411] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1644.402594] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1644.475506] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 6397ff19-1385-4e38-b199-666394582582 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.475699] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.475808] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.475940] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.476095] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.476197] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.476314] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.476430] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.476546] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.476660] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1644.487178] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1644.496915] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1644.505858] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1644.506094] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1644.506248] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1644.655870] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd003c9-5b40-46b4-8fc5-2c6902a8e17d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.663717] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3b6a397-f54a-4ef8-a59d-e56425af794a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.693752] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5e25ca9-f599-40b0-a23d-ea0f710aabcf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.700793] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccee0891-1915-4338-b36e-fa9d4545e183 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1644.714490] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1644.722024] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1644.734334] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1644.734513] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.332s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1646.734624] env[60764]: DEBUG oslo_service.periodic_task [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1646.735029] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1651.330308] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1652.325907] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1652.329656] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1652.329656] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1654.331857] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1672.099186] env[60764]: WARNING oslo_vmware.rw_handles [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1672.099186] env[60764]: ERROR oslo_vmware.rw_handles [ 1672.099973] env[60764]: DEBUG nova.virt.vmwareapi.images [None 
req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1672.101619] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1672.101834] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Copying Virtual Disk [datastore2] vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/bfecfbee-8373-4196-b20a-59fbb938f218/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1672.102143] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7b2bce94-19e6-451c-a3c3-e42eac49f5f1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1672.109460] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Waiting for the task: (returnval){ [ 1672.109460] env[60764]: value = "task-2205017" [ 1672.109460] env[60764]: _type = "Task" [ 1672.109460] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1672.117679] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Task: {'id': task-2205017, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1672.619610] env[60764]: DEBUG oslo_vmware.exceptions [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1672.619904] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1672.620498] env[60764]: ERROR nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1672.620498] env[60764]: Faults: ['InvalidArgument'] [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] Traceback (most recent call last): [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] yield resources [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self.driver.spawn(context, instance, image_meta, [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self._fetch_image_if_missing(context, vi) [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] image_cache(vi, tmp_image_ds_loc) [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] vm_util.copy_virtual_disk( [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] session._wait_for_task(vmdk_copy_task) [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] return self.wait_for_task(task_ref) [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] return evt.wait() [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] result = hub.switch() [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] return self.greenlet.switch() [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self.f(*self.args, **self.kw) [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] raise exceptions.translate_fault(task_info.error) [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] Faults: ['InvalidArgument'] [ 1672.620498] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] [ 1672.621603] env[60764]: INFO nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Terminating instance [ 1672.622358] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1672.622598] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1672.622808] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a3f87abd-b6e9-4fba-9514-59ce7530e0b1 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1672.624989] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1672.625200] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1672.625957] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2f57061-9d5d-4f1c-8b5c-140e938bb06b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1672.632808] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1672.633066] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e57bfe39-edd1-4f14-88b3-b97c43f4a4ca {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1672.635168] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1672.635356] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1672.636290] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f22e9bc8-59c3-43a8-84ee-b7c4bafeb955 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1672.640979] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 1672.640979] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]523abcce-bfc2-e044-0c3d-1f0d8ebc0ea3" [ 1672.640979] env[60764]: _type = "Task" [ 1672.640979] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1672.652223] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]523abcce-bfc2-e044-0c3d-1f0d8ebc0ea3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1672.702022] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1672.702022] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1672.702022] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Deleting the datastore file [datastore2] 6397ff19-1385-4e38-b199-666394582582 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1672.702022] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7c052528-db24-4f0a-a2b2-66bc69c7be92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1672.708382] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Waiting for the task: (returnval){ [ 1672.708382] env[60764]: value = "task-2205019" [ 1672.708382] env[60764]: _type = "Task" [ 1672.708382] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1672.716129] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Task: {'id': task-2205019, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1673.151599] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1673.151958] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating directory with path [datastore2] vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1673.152138] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-90c36492-0bc4-41c8-b98a-6b09ec102114 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.163411] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created directory with path [datastore2] vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1673.163601] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Fetch image to [datastore2] vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1673.163767] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1673.164509] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce19a18-1e52-4c02-a29b-183ec2fcae99 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.171107] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ad563d4-cf1f-4041-904b-ead99bb46a85 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.182253] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f24492c-a1ac-4c7a-8583-4875370d316a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.215064] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f12289bf-fe93-42fd-b9d0-8d15ab6082c6 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.224060] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1ba28d46-6ddb-47c3-838a-9b6a09c296fe {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.225730] env[60764]: DEBUG oslo_vmware.api [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Task: {'id': task-2205019, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07679} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1673.225967] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1673.226161] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1673.226361] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1673.226507] env[60764]: INFO nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1673.228546] env[60764]: DEBUG nova.compute.claims [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1673.228729] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1673.228965] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1673.247381] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1673.300437] env[60764]: DEBUG oslo_vmware.rw_handles [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1673.359901] env[60764]: DEBUG oslo_vmware.rw_handles [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1673.360104] env[60764]: DEBUG oslo_vmware.rw_handles [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1673.473653] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e8b2bab-e3e4-43c3-a256-f0b6cda9c011 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.482044] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-321d2090-afcb-4983-a5f9-4bccfe82ae67 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.512268] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e9147cf-ca13-42ab-b875-1e1140271a49 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.519533] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cce7025a-5d42-4212-8768-c665017e966d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1673.532507] env[60764]: DEBUG nova.compute.provider_tree [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1673.541052] env[60764]: DEBUG nova.scheduler.client.report [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1673.556256] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1673.556800] env[60764]: ERROR nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1673.556800] env[60764]: Faults: ['InvalidArgument'] [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] Traceback (most recent call last): [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1673.556800] env[60764]: ERROR 
nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self.driver.spawn(context, instance, image_meta, [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self._fetch_image_if_missing(context, vi) [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] image_cache(vi, tmp_image_ds_loc) [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] vm_util.copy_virtual_disk( [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] session._wait_for_task(vmdk_copy_task) [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] return self.wait_for_task(task_ref) [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] return evt.wait() [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] result = hub.switch() [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] return self.greenlet.switch() [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] self.f(*self.args, **self.kw) [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] raise exceptions.translate_fault(task_info.error) [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] Faults: ['InvalidArgument'] [ 1673.556800] env[60764]: ERROR nova.compute.manager [instance: 6397ff19-1385-4e38-b199-666394582582] [ 1673.558122] env[60764]: DEBUG nova.compute.utils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1673.560303] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Build of instance 6397ff19-1385-4e38-b199-666394582582 was re-scheduled: A specified parameter was not correct: fileType [ 1673.560303] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1673.560679] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1673.560864] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1673.561051] env[60764]: DEBUG nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1673.561222] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1673.887914] env[60764]: DEBUG nova.network.neutron [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1673.902285] env[60764]: INFO nova.compute.manager [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Took 0.34 seconds to deallocate network for instance. [ 1673.989747] env[60764]: INFO nova.scheduler.client.report [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Deleted allocations for instance 6397ff19-1385-4e38-b199-666394582582 [ 1674.011425] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d7e2c274-b965-442e-94b6-8ca239428efb tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "6397ff19-1385-4e38-b199-666394582582" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 590.914s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1674.012643] env[60764]: DEBUG oslo_concurrency.lockutils [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "6397ff19-1385-4e38-b199-666394582582" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 394.352s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.012885] env[60764]: DEBUG oslo_concurrency.lockutils [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Acquiring lock "6397ff19-1385-4e38-b199-666394582582-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.013096] env[60764]: DEBUG oslo_concurrency.lockutils [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "6397ff19-1385-4e38-b199-666394582582-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.013268] env[60764]: DEBUG oslo_concurrency.lockutils [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "6397ff19-1385-4e38-b199-666394582582-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1674.015201] env[60764]: INFO nova.compute.manager [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Terminating instance [ 1674.017010] env[60764]: DEBUG nova.compute.manager [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1674.017240] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1674.017722] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-567c1271-204d-443e-bd94-4e4ea37e15f2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.022284] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1674.028593] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84c902e0-66bd-4b7b-bce0-dc11bb391a51 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.058532] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6397ff19-1385-4e38-b199-666394582582 could not be found. [ 1674.058657] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1674.058801] env[60764]: INFO nova.compute.manager [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] [instance: 6397ff19-1385-4e38-b199-666394582582] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1674.059058] env[60764]: DEBUG oslo.service.loopingcall [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1674.061283] env[60764]: DEBUG nova.compute.manager [-] [instance: 6397ff19-1385-4e38-b199-666394582582] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1674.061388] env[60764]: DEBUG nova.network.neutron [-] [instance: 6397ff19-1385-4e38-b199-666394582582] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1674.076118] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.076350] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.077827] env[60764]: INFO nova.compute.claims [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1674.087029] env[60764]: DEBUG nova.network.neutron [-] [instance: 6397ff19-1385-4e38-b199-666394582582] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1674.097771] env[60764]: INFO nova.compute.manager [-] [instance: 6397ff19-1385-4e38-b199-666394582582] Took 0.04 seconds to deallocate network for instance. 
[ 1674.180784] env[60764]: DEBUG oslo_concurrency.lockutils [None req-85dd887a-e89f-4571-975b-c28e4ea92f37 tempest-ServerMetadataTestJSON-196107804 tempest-ServerMetadataTestJSON-196107804-project-member] Lock "6397ff19-1385-4e38-b199-666394582582" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.168s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1674.265563] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f74aab-e1fb-4221-8de2-97e51acbb021 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.273979] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4239af19-1026-41ee-8b6b-ddc7cbf4135d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.303090] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36247b1d-2781-4772-b94a-508a9c509561 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.309943] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f17158d7-6271-4d58-af97-ca003ec0a24c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.323010] env[60764]: DEBUG nova.compute.provider_tree [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1674.331479] env[60764]: DEBUG nova.scheduler.client.report [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1674.346209] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1674.346695] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1674.382491] env[60764]: DEBUG nova.compute.utils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1674.384667] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1674.384962] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1674.395933] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1674.439554] env[60764]: DEBUG nova.policy [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72051f4e68b049719e6faf2a31a92561', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8cd94049fe334cddb1283a0046e9ae48', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1674.464804] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1674.492395] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1674.492786] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1674.493080] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1674.493954] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1674.493954] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1674.494114] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1674.494669] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1674.494669] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} 
[ 1674.494980] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1674.495296] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1674.495666] env[60764]: DEBUG nova.virt.hardware [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1674.497054] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-708a00d2-f8cc-4536-9c46-eb4811a8e63f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.509452] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8faed861-39c2-4ff6-b289-29cc0c98adc4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.828262] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Successfully created port: af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1675.619068] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Successfully updated port: af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1675.635047] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "refresh_cache-73ba3af8-9a29-4c63-9a55-c9879e74239d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1675.635047] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired lock "refresh_cache-73ba3af8-9a29-4c63-9a55-c9879e74239d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1675.635047] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1675.679301] 
env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1675.879356] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Updating instance_info_cache with network_info: [{"id": "af26fab1-4670-4c3a-9f32-76e7f4d1e8a7", "address": "fa:16:3e:e5:77:12", "network": {"id": "90189b06-6aae-49a6-aa89-ec0c32b73181", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1482896272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8cd94049fe334cddb1283a0046e9ae48", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaf26fab1-46", "ovs_interfaceid": "af26fab1-4670-4c3a-9f32-76e7f4d1e8a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1675.890943] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Releasing lock "refresh_cache-73ba3af8-9a29-4c63-9a55-c9879e74239d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1675.891293] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance network_info: |[{"id": "af26fab1-4670-4c3a-9f32-76e7f4d1e8a7", "address": "fa:16:3e:e5:77:12", "network": {"id": "90189b06-6aae-49a6-aa89-ec0c32b73181", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1482896272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8cd94049fe334cddb1283a0046e9ae48", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tapaf26fab1-46", "ovs_interfaceid": "af26fab1-4670-4c3a-9f32-76e7f4d1e8a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1675.891752] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e5:77:12', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46785c9c-8b22-487d-a854-b3e67c5ed1d7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'af26fab1-4670-4c3a-9f32-76e7f4d1e8a7', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1675.902583] env[60764]: DEBUG oslo.service.loopingcall [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1675.903335] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1675.903633] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e39a2cb2-0d22-41f7-804d-3597a53fdf92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.926709] env[60764]: DEBUG nova.compute.manager [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Received event network-vif-plugged-af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1675.926918] env[60764]: DEBUG oslo_concurrency.lockutils [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] Acquiring lock "73ba3af8-9a29-4c63-9a55-c9879e74239d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1675.927111] env[60764]: DEBUG oslo_concurrency.lockutils [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1675.927276] env[60764]: DEBUG oslo_concurrency.lockutils [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1675.927438] env[60764]: DEBUG nova.compute.manager [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 
req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] No waiting events found dispatching network-vif-plugged-af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1675.927599] env[60764]: WARNING nova.compute.manager [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Received unexpected event network-vif-plugged-af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 for instance with vm_state building and task_state spawning. [ 1675.927754] env[60764]: DEBUG nova.compute.manager [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Received event network-changed-af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1675.927899] env[60764]: DEBUG nova.compute.manager [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Refreshing instance network info cache due to event network-changed-af26fab1-4670-4c3a-9f32-76e7f4d1e8a7. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1675.928093] env[60764]: DEBUG oslo_concurrency.lockutils [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] Acquiring lock "refresh_cache-73ba3af8-9a29-4c63-9a55-c9879e74239d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1675.928229] env[60764]: DEBUG oslo_concurrency.lockutils [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] Acquired lock "refresh_cache-73ba3af8-9a29-4c63-9a55-c9879e74239d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1675.928381] env[60764]: DEBUG nova.network.neutron [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Refreshing network info cache for port af26fab1-4670-4c3a-9f32-76e7f4d1e8a7 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1675.935886] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1675.935886] env[60764]: value = "task-2205020" [ 1675.935886] env[60764]: _type = "Task" [ 1675.935886] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1675.946932] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205020, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1676.261798] env[60764]: DEBUG nova.network.neutron [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Updated VIF entry in instance network info cache for port af26fab1-4670-4c3a-9f32-76e7f4d1e8a7. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1676.262173] env[60764]: DEBUG nova.network.neutron [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Updating instance_info_cache with network_info: [{"id": "af26fab1-4670-4c3a-9f32-76e7f4d1e8a7", "address": "fa:16:3e:e5:77:12", "network": {"id": "90189b06-6aae-49a6-aa89-ec0c32b73181", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1482896272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8cd94049fe334cddb1283a0046e9ae48", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaf26fab1-46", "ovs_interfaceid": "af26fab1-4670-4c3a-9f32-76e7f4d1e8a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1676.271292] env[60764]: DEBUG oslo_concurrency.lockutils [req-1a807d7c-f0ea-45dc-972b-e0c35cc22e01 req-e0929c32-ef17-4fd4-927f-6cadf1942fc6 service nova] Releasing lock "refresh_cache-73ba3af8-9a29-4c63-9a55-c9879e74239d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1676.446560] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205020, 'name': CreateVM_Task, 'duration_secs': 0.28795} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1676.446742] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1676.447417] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1676.447574] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1676.447886] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1676.448138] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-db228cd3-2e03-40ac-8399-56d2d37d6d07 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.453101] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 1676.453101] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52fa9368-86a3-6a3d-f1a6-5abb63601399" [ 1676.453101] env[60764]: _type = "Task" [ 1676.453101] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1676.461222] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52fa9368-86a3-6a3d-f1a6-5abb63601399, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1676.964726] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1676.965098] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1676.965149] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1677.414764] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "a6272c75-92de-45a0-8e3e-82e342f0475c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1702.330550] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1702.330905] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances with incomplete migration {{(pid=60764) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1703.222579] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1703.245863] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Getting list of instances from cluster (obj){ [ 1703.245863] env[60764]: value = "domain-c8" [ 1703.245863] env[60764]: _type = "ClusterComputeResource" [ 1703.245863] env[60764]: } {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1703.247377] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68f789af-2593-4dfd-abcf-a396e4402059 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1703.264856] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Got total of 10 instances {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1703.265051] 
env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid a3199f59-f827-404e-8272-296129096180 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.265250] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid c645f7f5-528b-4719-96dd-8e50a46b4261 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.265407] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 51512549-4c6e-41d4-98b0-7d1e801a8b69 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.265558] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid bf522599-8aa5-411a-96dd-8bd8328d9156 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.265711] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.265856] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid a83f4609-4c09-4056-a840-cd899af93ea3 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.266009] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid e96a6b8e-75b7-4a2f-a838-107603ad8b80 {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.266161] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 2ea05216-40c5-4482-a1d8-278f7ea3d28b {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.266338] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid a6272c75-92de-45a0-8e3e-82e342f0475c {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.266512] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 73ba3af8-9a29-4c63-9a55-c9879e74239d {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1703.266840] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "a3199f59-f827-404e-8272-296129096180" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.267082] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "c645f7f5-528b-4719-96dd-8e50a46b4261" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.267317] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.267498] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "bf522599-8aa5-411a-96dd-8bd8328d9156" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.267731] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.267963] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "a83f4609-4c09-4056-a840-cd899af93ea3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.268181] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.268381] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.268608] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "a6272c75-92de-45a0-8e3e-82e342f0475c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.268817] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1704.376728] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1704.376728] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1704.376728] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1704.395710] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.395854] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.395982] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.396290] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.396468] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.396603] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.396792] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.396959] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.397132] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.397288] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1704.397482] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1704.397995] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1706.330387] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1706.330733] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1706.342675] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1706.342889] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1706.343063] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1706.343225] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1706.344336] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b5e6586-7c78-41c1-8840-20507b26c2a6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.352946] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bde14d17-e4fc-43c4-88b7-4928246b8b96 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.367891] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2450b537-b059-4190-acd2-69f20bba6bcf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.374040] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-817bc00b-b3e2-4180-a3e1-54a0e60a2c68 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.402310] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181267MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1706.402455] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1706.402643] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1706.534581] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a3199f59-f827-404e-8272-296129096180 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.534746] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.534878] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535011] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535147] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535268] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535383] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535500] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535614] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.535727] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1706.549498] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1706.560145] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1706.560746] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1706.560746] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1706.575975] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing inventories for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1706.589515] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating ProviderTree inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1706.589688] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1706.599919] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing aggregate associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, aggregates: None {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1706.616436] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing trait associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1706.746566] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f7e971-2916-4e76-a9c1-26908fb995bb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.754095] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f4c55164-fa9c-42dd-a657-bc9e81e914f2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.783910] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a5a346f-1186-4b98-a6fd-fd939d705ce2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.790532] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60ab2f29-243c-4e1b-9656-5b028f16e631 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1706.802961] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1706.811687] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1706.824684] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1706.824877] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.422s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1707.331574] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1707.331574] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1707.331574] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1707.339174] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] There are 0 instances to clean {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1711.334796] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task 
ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1711.356799] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1713.330621] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1713.331043] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1714.325896] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1714.329548] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1716.337637] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1718.711931] env[60764]: WARNING oslo_vmware.rw_handles [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1718.711931] env[60764]: ERROR oslo_vmware.rw_handles [ 1718.712622] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 
tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1718.714324] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1718.714571] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Copying Virtual Disk [datastore2] vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/a5764865-9761-4c63-ba56-9b96c79dd411/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1718.714873] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-90a7a7bc-4367-438f-b706-1db58aba7055 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1718.722338] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 1718.722338] env[60764]: value = "task-2205021" [ 1718.722338] env[60764]: _type = "Task" [ 1718.722338] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1718.730131] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205021, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1719.233099] env[60764]: DEBUG oslo_vmware.exceptions [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1719.233375] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1719.233922] env[60764]: ERROR nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1719.233922] env[60764]: Faults: ['InvalidArgument'] [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] Traceback (most recent call last): [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] yield resources [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self.driver.spawn(context, instance, image_meta, [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self._fetch_image_if_missing(context, vi) [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] image_cache(vi, tmp_image_ds_loc) [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] vm_util.copy_virtual_disk( [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] session._wait_for_task(vmdk_copy_task) [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] return self.wait_for_task(task_ref) [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] return evt.wait() [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] result = hub.switch() [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] return self.greenlet.switch() [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self.f(*self.args, **self.kw) [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] raise exceptions.translate_fault(task_info.error) [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] Faults: ['InvalidArgument'] [ 1719.233922] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] [ 1719.235358] env[60764]: INFO nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Terminating instance [ 1719.235758] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1719.235963] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1719.236214] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3bee6d41-240b-4afa-b8d6-aff539975fe0 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.238548] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1719.238740] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1719.239466] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-689bc6c2-3261-4b13-85ea-6f4702fa8e25 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.245830] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1719.246047] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-40e4f51a-b37f-46a9-9de9-82f4c820c115 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.248110] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1719.248283] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1719.249212] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c868f44-d95a-4a91-9807-6fe9d4cafe71 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.253681] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 1719.253681] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]527d7e45-44be-e344-0579-867fa6a263b5" [ 1719.253681] env[60764]: _type = "Task" [ 1719.253681] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1719.267607] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1719.267831] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating directory with path [datastore2] vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1719.268056] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9c7268ea-c89e-4947-9688-5545fb409bf9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.289416] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Created directory with path [datastore2] vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1719.289595] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Fetch image to [datastore2] vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1719.289755] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1719.290581] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8da95aaf-df80-47ea-b3d8-6637caae8c29 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.298880] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b7b2510-cdc6-47ec-88e0-80b6f9263ab6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.308389] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c21388-4c99-4953-b95a-f1c4fd18d61f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.313136] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 
tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1719.313335] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1719.313506] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleting the datastore file [datastore2] a3199f59-f827-404e-8272-296129096180 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1719.314050] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-235cc7bb-8314-44fa-9ca4-25819bc6801a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.342381] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e3bc5f1-3dbe-4ef6-8b89-b99a37849baa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.344935] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 1719.344935] env[60764]: value = "task-2205023" [ 1719.344935] env[60764]: _type = "Task" [ 1719.344935] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1719.349932] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fefa1226-4c8d-4de7-b94e-081540563cb9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1719.354165] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205023, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1719.375530] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1719.425403] env[60764]: DEBUG oslo_vmware.rw_handles [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1719.483706] env[60764]: DEBUG oslo_vmware.rw_handles [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1719.483894] env[60764]: DEBUG oslo_vmware.rw_handles [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1719.855272] env[60764]: DEBUG oslo_vmware.api [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205023, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066425} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1719.855604] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1719.855714] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1719.855834] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1719.856016] env[60764]: INFO nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Took 0.62 seconds to destroy the instance on the hypervisor. 
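The failed spawn above ends in oslo_vmware's task poller: after CopyVirtualDisk_Task is invoked, _poll_task repeatedly reads the task state, logs "progress is N%", and on error raises the translated fault (here a VimFaultException carrying 'InvalidArgument' for the bad fileType), while the later DeleteDatastoreFile_Task completes normally. The following is a minimal, self-contained sketch of that poll-and-translate pattern; TaskInfo and the fetch_task_info callable are hypothetical stand-ins for the real vSphere property reads, not oslo.vmware's API.

import time
from dataclasses import dataclass, field


class VimFaultException(Exception):
    """Simplified stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, msg):
        super().__init__(msg)
        self.fault_list = fault_list


@dataclass
class TaskInfo:
    state: str                        # 'running' | 'success' | 'error'
    progress: int = 0
    error_msg: str = ''
    faults: list = field(default_factory=list)


def wait_for_task(fetch_task_info, task_id, poll_interval=0.5):
    # Poll until the task leaves the 'running' state, mirroring the
    # "progress is N%" / "completed successfully" / raised-fault lines above.
    while True:
        info = fetch_task_info(task_id)
        if info.state == 'running':
            print("Task: {'id': %r} progress is %d%%." % (task_id, info.progress))
            time.sleep(poll_interval)
        elif info.state == 'success':
            print("Task %r completed successfully." % task_id)
            return info
        else:
            # Error state: translate the stored fault into an exception,
            # which is what surfaces as the InvalidArgument traceback above.
            raise VimFaultException(info.faults, info.error_msg)

A fetch_task_info that eventually returns TaskInfo(state='error', error_msg='A specified parameter was not correct: fileType', faults=['InvalidArgument']) makes this loop raise the same exception type seen in the traceback.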
[ 1719.858218] env[60764]: DEBUG nova.compute.claims [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1719.858392] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1719.858607] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1720.052327] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbd6d4b5-64d2-4ff3-a547-eac245f91a33 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.059651] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4826adb1-1638-4ebe-a38c-e3c9a77dd8a5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.090126] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60845850-7b85-48fc-8eda-406a802ec5dc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.097329] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-218e1527-2930-4839-af90-29d53f5c460f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.110232] env[60764]: DEBUG nova.compute.provider_tree [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1720.118540] env[60764]: DEBUG nova.scheduler.client.report [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1720.131845] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.273s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1720.133025] env[60764]: ERROR nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1720.133025] env[60764]: Faults: ['InvalidArgument'] [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] Traceback (most recent call last): [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self.driver.spawn(context, instance, image_meta, [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self._fetch_image_if_missing(context, vi) [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] image_cache(vi, tmp_image_ds_loc) [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] vm_util.copy_virtual_disk( [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] session._wait_for_task(vmdk_copy_task) [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] return self.wait_for_task(task_ref) [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] return evt.wait() [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] result = hub.switch() [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] return self.greenlet.switch() [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] self.f(*self.args, **self.kw) [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] raise exceptions.translate_fault(task_info.error) [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] Faults: ['InvalidArgument'] [ 1720.133025] env[60764]: ERROR nova.compute.manager [instance: a3199f59-f827-404e-8272-296129096180] [ 1720.134547] env[60764]: DEBUG nova.compute.utils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1720.134682] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Build of instance a3199f59-f827-404e-8272-296129096180 was re-scheduled: A specified parameter was not correct: fileType [ 1720.134682] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1720.135069] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1720.135248] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1720.135425] env[60764]: DEBUG nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1720.135588] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1720.471331] env[60764]: DEBUG nova.network.neutron [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1720.481396] env[60764]: INFO nova.compute.manager [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Took 0.35 seconds to deallocate network for instance. [ 1720.569766] env[60764]: INFO nova.scheduler.client.report [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleted allocations for instance a3199f59-f827-404e-8272-296129096180 [ 1720.589253] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dde38235-ebe5-4e9b-aa4f-7165d4f6c790 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a3199f59-f827-404e-8272-296129096180" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.765s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1720.590387] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a3199f59-f827-404e-8272-296129096180" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.557s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1720.590610] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "a3199f59-f827-404e-8272-296129096180-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1720.590814] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a3199f59-f827-404e-8272-296129096180-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1720.590979] env[60764]: 
DEBUG oslo_concurrency.lockutils [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a3199f59-f827-404e-8272-296129096180-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1720.592903] env[60764]: INFO nova.compute.manager [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Terminating instance [ 1720.594735] env[60764]: DEBUG nova.compute.manager [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1720.594814] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1720.595265] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-40096e27-d184-4926-82eb-1c2bf76ec59d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.604131] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7021ea5-332a-492c-b8c6-ddcc2c987755 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.614977] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1720.635044] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a3199f59-f827-404e-8272-296129096180 could not be found. [ 1720.635283] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1720.635482] env[60764]: INFO nova.compute.manager [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a3199f59-f827-404e-8272-296129096180] Took 0.04 seconds to destroy the instance on the hypervisor. 
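The terminate path just above shows the destroy step tolerating a VM that no longer exists on the backend: InstanceNotFound is caught and logged as a warning, and teardown continues so that network deallocation and claim cleanup still run. Below is a condensed sketch of that shape; InstanceNotFound, lookup_vm_ref and unregister_and_delete are illustrative stand-ins, not the actual nova.virt.vmwareapi.vmops helpers.

import logging
import time

LOG = logging.getLogger(__name__)


class InstanceNotFound(Exception):
    pass


def destroy_instance(instance_uuid, lookup_vm_ref, unregister_and_delete):
    start = time.monotonic()
    try:
        vm_ref = lookup_vm_ref(instance_uuid)
        unregister_and_delete(vm_ref)   # UnregisterVM + DeleteDatastoreFile_Task
    except InstanceNotFound:
        # Nothing left on the hypervisor; warn and fall through so that
        # network deallocation and resource-claim cleanup still happen.
        LOG.warning("Instance does not exist on backend: %s", instance_uuid)
    LOG.info("Took %.2f seconds to destroy the instance on the hypervisor.",
             time.monotonic() - start)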
[ 1720.635742] env[60764]: DEBUG oslo.service.loopingcall [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1720.635973] env[60764]: DEBUG nova.compute.manager [-] [instance: a3199f59-f827-404e-8272-296129096180] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1720.636094] env[60764]: DEBUG nova.network.neutron [-] [instance: a3199f59-f827-404e-8272-296129096180] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1720.663848] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1720.664186] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1720.665824] env[60764]: INFO nova.compute.claims [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1720.670428] env[60764]: DEBUG nova.network.neutron [-] [instance: a3199f59-f827-404e-8272-296129096180] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1720.678840] env[60764]: INFO nova.compute.manager [-] [instance: a3199f59-f827-404e-8272-296129096180] Took 0.04 seconds to deallocate network for instance. [ 1720.796090] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c4245564-c180-4abd-87bf-dedf5a80c91e tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a3199f59-f827-404e-8272-296129096180" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.206s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1720.797150] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "a3199f59-f827-404e-8272-296129096180" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 17.530s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1720.797236] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a3199f59-f827-404e-8272-296129096180] During sync_power_state the instance has a pending task (deleting). Skip. 
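The recurring 'Acquiring lock / acquired :: waited / "released" :: held' triples throughout this log (compute_resources, the instance UUID, the -events lock, refresh_cache-<uuid>) come from a timing wrapper around named locks. A rough, self-contained illustration of how such wait/held timings can be produced with a plain threading.Lock is shown below; it only mimics the log format and is not the oslo_concurrency.lockutils implementation.

import threading
import time
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()


@contextmanager
def timed_lock(name, owner):
    # Look up (or create) the named lock under a small registry guard.
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print('Acquiring lock "%s" by "%s"' % (name, owner))
    t0 = time.monotonic()
    lock.acquire()
    print('Lock "%s" acquired by "%s" :: waited %.3fs'
          % (name, owner, time.monotonic() - t0))
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        print('Lock "%s" "released" by "%s" :: held %.3fs'
              % (name, owner, time.monotonic() - t1))


# Example: the compute_resources lock held while aborting an instance claim.
with timed_lock("compute_resources", "ResourceTracker.abort_instance_claim"):
    time.sleep(0.01)  # placeholder for the resource-tracker work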
[ 1720.797354] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "a3199f59-f827-404e-8272-296129096180" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1720.872383] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a8b3c7d-2940-405e-9726-ca6ee0f65ab9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.880668] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2955b16a-f4b2-4987-a3e3-6d15d77141dd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.910118] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31b99a01-df16-4ab5-a210-cd9d70c35b60 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.917877] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9282ff-503b-41f0-b6d0-6bf74e0c2f9a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1720.931068] env[60764]: DEBUG nova.compute.provider_tree [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1720.939958] env[60764]: DEBUG nova.scheduler.client.report [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1720.954840] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1720.955420] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1721.007019] env[60764]: DEBUG nova.compute.utils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1721.008449] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1721.008674] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1721.016646] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1721.065694] env[60764]: DEBUG nova.policy [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fa689965f9734683b98d601d3fccfb91', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c6d47126b96d442eb2b385f25fd25081', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1721.077265] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1721.103469] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1721.103904] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1721.104215] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1721.104516] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1721.105185] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1721.105185] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1721.105344] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1721.105389] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1721.105522] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 
tempest-ServersTestJSON-1786688920-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1721.105679] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1721.105848] env[60764]: DEBUG nova.virt.hardware [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1721.106716] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6e38df1-2a46-45a2-bc2b-23485ec69be6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1721.114789] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd59d609-093d-4907-a991-c24c3b21b65c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1721.579798] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Successfully created port: 8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1722.213675] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Successfully updated port: 8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1722.225107] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "refresh_cache-f1940470-82f6-41fb-bd36-96561ad20102" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1722.225456] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquired lock "refresh_cache-f1940470-82f6-41fb-bd36-96561ad20102" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1722.225456] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1722.263660] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1722.413714] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Updating instance_info_cache with network_info: [{"id": "8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2", "address": "fa:16:3e:d1:84:10", "network": {"id": "fc4c454b-cd25-4a4f-9137-7594e710062e", "bridge": "br-int", "label": "tempest-ServersTestJSON-388324108-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6d47126b96d442eb2b385f25fd25081", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "456bd8a2-0fb6-4b17-9d25-08e7995c5184", "external-id": "nsx-vlan-transportzone-65", "segmentation_id": 65, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8db9f8dd-6e", "ovs_interfaceid": "8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1722.425857] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Releasing lock "refresh_cache-f1940470-82f6-41fb-bd36-96561ad20102" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1722.426157] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance network_info: |[{"id": "8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2", "address": "fa:16:3e:d1:84:10", "network": {"id": "fc4c454b-cd25-4a4f-9137-7594e710062e", "bridge": "br-int", "label": "tempest-ServersTestJSON-388324108-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6d47126b96d442eb2b385f25fd25081", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "456bd8a2-0fb6-4b17-9d25-08e7995c5184", "external-id": "nsx-vlan-transportzone-65", "segmentation_id": 65, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8db9f8dd-6e", "ovs_interfaceid": "8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1722.426670] env[60764]: DEBUG 
nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d1:84:10', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '456bd8a2-0fb6-4b17-9d25-08e7995c5184', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1722.434163] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Creating folder: Project (c6d47126b96d442eb2b385f25fd25081). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1722.434669] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3ed223db-40d1-4c77-aa87-6f48655564ec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1722.445532] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Created folder: Project (c6d47126b96d442eb2b385f25fd25081) in parent group-v449629. [ 1722.445715] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Creating folder: Instances. Parent ref: group-v449734. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1722.445938] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9decade7-284a-4b47-aa25-9d27966bf483 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1722.454824] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Created folder: Instances in parent group-v449734. [ 1722.455048] env[60764]: DEBUG oslo.service.loopingcall [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1722.455226] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1722.455417] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cfb00223-167e-4e74-a7eb-522b24249ba3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1722.473739] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1722.473739] env[60764]: value = "task-2205026" [ 1722.473739] env[60764]: _type = "Task" [ 1722.473739] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1722.480830] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205026, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1722.504278] env[60764]: DEBUG nova.compute.manager [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Received event network-vif-plugged-8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1722.504464] env[60764]: DEBUG oslo_concurrency.lockutils [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] Acquiring lock "f1940470-82f6-41fb-bd36-96561ad20102-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1722.504599] env[60764]: DEBUG oslo_concurrency.lockutils [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] Lock "f1940470-82f6-41fb-bd36-96561ad20102-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1722.504793] env[60764]: DEBUG oslo_concurrency.lockutils [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] Lock "f1940470-82f6-41fb-bd36-96561ad20102-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1722.504917] env[60764]: DEBUG nova.compute.manager [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] No waiting events found dispatching network-vif-plugged-8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1722.505103] env[60764]: WARNING nova.compute.manager [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Received unexpected event network-vif-plugged-8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 for instance with vm_state building and task_state spawning. [ 1722.505264] env[60764]: DEBUG nova.compute.manager [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Received event network-changed-8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1722.505413] env[60764]: DEBUG nova.compute.manager [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Refreshing instance network info cache due to event network-changed-8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1722.505591] env[60764]: DEBUG oslo_concurrency.lockutils [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] Acquiring lock "refresh_cache-f1940470-82f6-41fb-bd36-96561ad20102" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1722.506167] env[60764]: DEBUG oslo_concurrency.lockutils [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] Acquired lock "refresh_cache-f1940470-82f6-41fb-bd36-96561ad20102" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1722.506167] env[60764]: DEBUG nova.network.neutron [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Refreshing network info cache for port 8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1722.792515] env[60764]: DEBUG nova.network.neutron [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Updated VIF entry in instance network info cache for port 8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1722.792906] env[60764]: DEBUG nova.network.neutron [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Updating instance_info_cache with network_info: [{"id": "8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2", "address": "fa:16:3e:d1:84:10", "network": {"id": "fc4c454b-cd25-4a4f-9137-7594e710062e", "bridge": "br-int", "label": "tempest-ServersTestJSON-388324108-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c6d47126b96d442eb2b385f25fd25081", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "456bd8a2-0fb6-4b17-9d25-08e7995c5184", "external-id": "nsx-vlan-transportzone-65", "segmentation_id": 65, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8db9f8dd-6e", "ovs_interfaceid": "8db9f8dd-6e40-4cd6-a9ee-eb41c8d531b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1722.801851] env[60764]: DEBUG oslo_concurrency.lockutils [req-0182229f-c91b-4d78-a8e5-076183f4fe77 req-56a2c1ab-9a1a-4179-a543-4493b6e2a3db service nova] Releasing lock "refresh_cache-f1940470-82f6-41fb-bd36-96561ad20102" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1722.983667] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205026, 'name': CreateVM_Task, 'duration_secs': 0.288835} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1722.983822] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1722.984489] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1722.984654] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1722.984979] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1722.985241] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55c6a4c6-d160-4800-8f91-a13c3063376d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1722.989487] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Waiting for the task: (returnval){ [ 1722.989487] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]525dbb90-d55a-5a17-2688-9ec00472cbb4" [ 1722.989487] env[60764]: _type = "Task" [ 1722.989487] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1722.996658] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]525dbb90-d55a-5a17-2688-9ec00472cbb4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1723.502084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1723.502084] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1723.502084] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1739.793100] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.169182] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "aa9f1e61-ac26-495c-a698-5163661401a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.169561] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "aa9f1e61-ac26-495c-a698-5163661401a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1765.333796] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1765.334137] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1765.334335] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1765.351526] env[60764]: DEBUG oslo_concurrency.lockutils [None 
req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "f1940470-82f6-41fb-bd36-96561ad20102" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1765.356776] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.356937] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357205] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357311] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357436] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357558] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357675] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357790] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.357905] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.358082] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1765.358195] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1766.330316] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1767.129621] env[60764]: WARNING oslo_vmware.rw_handles [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1767.129621] env[60764]: ERROR oslo_vmware.rw_handles [ 1767.130172] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1767.132129] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1767.132384] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Copying Virtual Disk [datastore2] vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] 
vmware_temp/09b2b902-15ff-49a7-920a-b91be4c9b2c3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1767.132696] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-839f607f-6fa5-44d4-8f9e-650de4290e32 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.140969] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 1767.140969] env[60764]: value = "task-2205027" [ 1767.140969] env[60764]: _type = "Task" [ 1767.140969] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1767.148569] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': task-2205027, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1767.330435] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1767.341975] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1767.342198] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1767.342369] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1767.342565] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1767.343645] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-404a87a4-7426-4fbe-8743-1b95a1543a85 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.352197] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e0c1748-ec0f-4f13-b483-038b5ee119bc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1767.366965] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9215624-2225-4f53-b4ce-70aebb4c5786 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.373077] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a936229-2f78-4207-91d1-8a6b2a022e46 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.401301] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181262MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1767.401436] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1767.401620] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1767.471454] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c645f7f5-528b-4719-96dd-8e50a46b4261 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.471614] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.471741] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.471862] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.471979] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.472114] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.472256] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.472414] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.472533] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.472645] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1767.482687] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1767.492873] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1767.493094] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1767.493240] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1767.632767] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f26b7ff-6338-42fb-afbb-722fee46e549 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.640489] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c5baa50-4c86-44f4-95c7-f56afc1b9e26 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.651281] env[60764]: DEBUG oslo_vmware.exceptions [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1767.675767] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1767.676372] env[60764]: ERROR nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1767.676372] env[60764]: Faults: ['InvalidArgument'] [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Traceback (most recent call last): [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] yield resources [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self.driver.spawn(context, instance, image_meta, [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1767.676372] env[60764]: ERROR 
nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self._fetch_image_if_missing(context, vi) [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] image_cache(vi, tmp_image_ds_loc) [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] vm_util.copy_virtual_disk( [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] session._wait_for_task(vmdk_copy_task) [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] return self.wait_for_task(task_ref) [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] return evt.wait() [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] result = hub.switch() [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] return self.greenlet.switch() [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self.f(*self.args, **self.kw) [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] raise exceptions.translate_fault(task_info.error) [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Faults: ['InvalidArgument'] [ 1767.676372] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] [ 1767.677180] env[60764]: INFO nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Terminating instance [ 1767.678439] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1767.678653] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1767.679436] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66f9761b-f90a-403a-9a48-37b76ee2e7a6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.682246] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1767.682443] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1767.682658] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f007c739-a141-40d7-aec8-5fd7c4e1d9b0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.684716] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56dd47c1-30bf-431e-8263-ad103da0acdf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.692346] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e0e1fee-d496-4c8a-bc83-0d8a51f7e4ac {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.698384] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1767.698571] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1767.699905] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-efa9d169-33d7-4915-818a-7007c4509561 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.711266] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1767.711637] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1767.713021] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bdd87d4d-6456-4eee-8329-6439050bc164 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.715594] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Waiting for the task: (returnval){ [ 1767.715594] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524f1a22-9974-692d-9469-e13b276c0da4" [ 1767.715594] env[60764]: _type = "Task" [ 1767.715594] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1767.721294] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1767.727354] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524f1a22-9974-692d-9469-e13b276c0da4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1767.734153] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1767.734332] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1767.783020] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1767.783020] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1767.783020] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Deleting the datastore file [datastore2] c645f7f5-528b-4719-96dd-8e50a46b4261 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1767.783020] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-85c18303-16ee-4ff3-9ee1-0f3dac3f415e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1767.789166] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 1767.789166] env[60764]: value = "task-2205029" [ 
1767.789166] env[60764]: _type = "Task" [ 1767.789166] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1767.797308] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': task-2205029, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1768.225924] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1768.226224] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Creating directory with path [datastore2] vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1768.226384] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fabe4d87-404b-46df-b739-0a5ba31e8e22 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.237251] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Created directory with path [datastore2] vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1768.237430] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Fetch image to [datastore2] vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1768.237597] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1768.238367] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a029e0-73eb-4570-962f-2efe6358d39a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.244866] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f7db560-9051-44bb-ae83-97f18fea885b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1768.254038] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fe01ad5-31e0-472e-b764-8b7c65f0ca43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.285220] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f6c3a3c-cf84-4739-b852-fb89a570ee80 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.293174] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f91aedf3-06af-4219-9512-27fec6f8b983 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.299116] env[60764]: DEBUG oslo_vmware.api [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': task-2205029, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070008} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1768.299380] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1768.299579] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1768.299753] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1768.299922] env[60764]: INFO nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Took 0.62 seconds to destroy the instance on the hypervisor. 
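The CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same pattern: the driver submits a vCenter task, then wait_for_task/_poll_task keep polling the task's state ("progress is 0%") until it either completes successfully (reporting duration_secs) or fails with a translated fault. The following is a minimal, self-contained sketch of that polling loop in plain Python, included only to illustrate the control flow; it is not the oslo.vmware implementation, and FakeTask and its states are invented stand-ins for a real vCenter TaskInfo lookup.

    import itertools
    import time

    class FakeTask:
        """Invented stand-in for a vCenter task (e.g. task-2205029).
        It reports 'running' twice, then 'success'."""
        def __init__(self, name):
            self.name = name
            self._states = itertools.chain(
                [('running', 0), ('running', 50)],
                itertools.repeat(('success', 100)))

        def info(self):
            state, progress = next(self._states)
            return {'state': state, 'progress': progress, 'error': None}

    def wait_for_task(task, interval=0.1):
        """Poll until the task succeeds or fails, mirroring the
        'progress is N%' / 'completed successfully' lines in the log."""
        while True:
            info = task.info()
            if info['state'] == 'success':
                print("Task %s completed successfully." % task.name)
                return info
            if info['state'] == 'error':
                # A real session would translate the server-side fault into an
                # exception at this point, e.g. the InvalidArgument 'fileType'
                # fault recorded earlier in this log.
                raise RuntimeError(info['error'])
            print("Task %s progress is %d%%." % (task.name, info['progress']))
            time.sleep(interval)

    wait_for_task(FakeTask('task-2205029'))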
[ 1768.302296] env[60764]: DEBUG nova.compute.claims [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1768.302475] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1768.302693] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1768.314757] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1768.362142] env[60764]: DEBUG oslo_vmware.rw_handles [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1768.421506] env[60764]: DEBUG oslo_vmware.rw_handles [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1768.421688] env[60764]: DEBUG oslo_vmware.rw_handles [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1768.524348] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cddda20e-da6d-4971-9cb0-37c598c7b7c4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.530845] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a714ba8e-fcee-48a0-864f-6afdb93b41c0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.561066] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f70bcf1-8744-4cce-8a14-24a72674e518 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.567843] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f4c1a30-ed15-456d-a65b-bc8307516527 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1768.580850] env[60764]: DEBUG nova.compute.provider_tree [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1768.588971] env[60764]: DEBUG nova.scheduler.client.report [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1768.602905] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.300s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1768.603444] env[60764]: ERROR nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1768.603444] env[60764]: Faults: ['InvalidArgument'] [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Traceback (most recent call last): [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self.driver.spawn(context, instance, image_meta, [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self._fetch_image_if_missing(context, vi) [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] image_cache(vi, tmp_image_ds_loc) [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] vm_util.copy_virtual_disk( [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] session._wait_for_task(vmdk_copy_task) [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] return self.wait_for_task(task_ref) [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] return evt.wait() [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] result = hub.switch() [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] return self.greenlet.switch() [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] self.f(*self.args, **self.kw) [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] 
File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] raise exceptions.translate_fault(task_info.error) [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Faults: ['InvalidArgument'] [ 1768.603444] env[60764]: ERROR nova.compute.manager [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] [ 1768.604191] env[60764]: DEBUG nova.compute.utils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1768.605811] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Build of instance c645f7f5-528b-4719-96dd-8e50a46b4261 was re-scheduled: A specified parameter was not correct: fileType [ 1768.605811] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1768.606220] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1768.606394] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1768.606565] env[60764]: DEBUG nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1768.606724] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1768.733543] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1768.875994] env[60764]: DEBUG nova.network.neutron [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1768.887259] env[60764]: INFO nova.compute.manager [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Took 0.28 seconds to deallocate network for instance. 
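The traceback above ends in oslo_vmware.exceptions.VimFaultException with fault list ['InvalidArgument'] raised while waiting on the CopyVirtualDisk_Task, which is why the build of c645f7f5-528b-4719-96dd-8e50a46b4261 is re-scheduled and its network deallocated. A minimal sketch of that failure surface, assuming only a VMwareAPISession-like object exposing wait_for_task as in the stack trace; the wrapper function and its logging below are illustrative, not Nova source:

from oslo_vmware import exceptions as vexc

def wait_for_copy_task(session, vmdk_copy_task):
    # Illustrative wrapper, not Nova code: wait on the vCenter task object and
    # surface the fault details the way the traceback above does.
    try:
        return session.wait_for_task(vmdk_copy_task)
    except vexc.VimFaultException as exc:
        # exc.fault_list holds the raw fault names, e.g. ['InvalidArgument'];
        # when this propagates out of spawn(), the compute manager re-schedules
        # the build, as seen in the log entries following the traceback.
        print('CopyVirtualDisk_Task failed: %s (faults: %s)'
              % (exc, exc.fault_list))
        raise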
[ 1769.000410] env[60764]: INFO nova.scheduler.client.report [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Deleted allocations for instance c645f7f5-528b-4719-96dd-8e50a46b4261 [ 1769.026353] env[60764]: DEBUG oslo_concurrency.lockutils [None req-047a01a6-65a9-49db-a3bf-2aea60567c71 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 617.809s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1769.029442] env[60764]: DEBUG oslo_concurrency.lockutils [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 422.690s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1769.029442] env[60764]: DEBUG oslo_concurrency.lockutils [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "c645f7f5-528b-4719-96dd-8e50a46b4261-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1769.029442] env[60764]: DEBUG oslo_concurrency.lockutils [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1769.029442] env[60764]: DEBUG oslo_concurrency.lockutils [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1769.031386] env[60764]: INFO nova.compute.manager [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Terminating instance [ 1769.034015] env[60764]: DEBUG nova.compute.manager [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1769.034357] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1769.034971] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3dcfb803-635d-488d-8f03-c19bc9efa624 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.039193] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1769.047726] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a550f7-0694-48d1-bfe9-8444db5bb3f6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.077344] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c645f7f5-528b-4719-96dd-8e50a46b4261 could not be found. [ 1769.077564] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1769.077742] env[60764]: INFO nova.compute.manager [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1769.077982] env[60764]: DEBUG oslo.service.loopingcall [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1769.082492] env[60764]: DEBUG nova.compute.manager [-] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1769.082600] env[60764]: DEBUG nova.network.neutron [-] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1769.094278] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1769.094513] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1769.095921] env[60764]: INFO nova.compute.claims [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1769.115335] env[60764]: DEBUG nova.network.neutron [-] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1769.130369] env[60764]: INFO nova.compute.manager [-] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] Took 0.05 seconds to deallocate network for instance. [ 1769.213310] env[60764]: DEBUG oslo_concurrency.lockutils [None req-689ac139-db04-40ab-b8a5-d067500c8d5b tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.185s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1769.214531] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 65.947s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1769.214531] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c645f7f5-528b-4719-96dd-8e50a46b4261] During sync_power_state the instance has a pending task (deleting). Skip. 
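The acquired/held/released messages above are emitted by oslo.concurrency's lockutils: the terminate path waited 422.690s for the per-instance build lock to be released, then briefly took the "-events" lock to clear pending instance events. A minimal sketch of the decorator pattern behind those messages, with illustrative function bodies (the real work in the log is done by ComputeManager's terminate and _clear_events code):

from oslo_concurrency import lockutils

# Serialize work on a single instance by lock name; entering and leaving the
# decorated function produces the "acquired by" / "released by" DEBUG lines
# seen above when debug logging is enabled.
@lockutils.synchronized('c645f7f5-528b-4719-96dd-8e50a46b4261')
def do_terminate_instance():
    # Illustrative body only; Nova's handler tears down the guest,
    # deallocates networking and clears queued instance events here.
    pass

@lockutils.synchronized('c645f7f5-528b-4719-96dd-8e50a46b4261-events')
def clear_events():
    # Illustrative: the short-lived "-events" lock held for ~0.000s above.
    pass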
[ 1769.214531] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "c645f7f5-528b-4719-96dd-8e50a46b4261" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1769.277601] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ad0982d-fbd1-49c5-913e-50952abb3b34 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.285446] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30b556a6-c871-4567-bf70-d9f0588fa807 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.316370] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6333dce7-17e9-4114-b560-4b8adb66296c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.323172] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69db10c-68cb-413b-b7b1-d611578007ea {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.335872] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1769.336310] env[60764]: DEBUG nova.compute.provider_tree [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1769.346450] env[60764]: DEBUG nova.scheduler.client.report [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1769.359619] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1769.360092] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 
55ca3e89-807f-473c-8b5b-346fc2ea23f8] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1769.395059] env[60764]: DEBUG nova.compute.utils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1769.396375] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1769.396554] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1769.406045] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1769.456030] env[60764]: DEBUG nova.policy [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd856b6f35251478d8ff7ac126f1557d3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '06346705ad3e441ba21ce436297e94d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1769.467087] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1769.491529] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1769.491777] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1769.491930] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1769.492126] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1769.492276] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1769.492422] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1769.492627] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1769.492780] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1769.492944] env[60764]: DEBUG nova.virt.hardware [None 
req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1769.493120] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1769.493295] env[60764]: DEBUG nova.virt.hardware [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1769.494176] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19428062-186f-4edf-8f70-bbd468c9cc30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.501751] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a285f2ed-8648-4dbd-9c06-e5dc0a47f16c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1769.738988] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Successfully created port: ea922daa-1d55-4604-9060-3c76abcf0a74 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1770.548068] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Successfully updated port: ea922daa-1d55-4604-9060-3c76abcf0a74 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1770.561883] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "refresh_cache-55ca3e89-807f-473c-8b5b-346fc2ea23f8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1770.562106] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquired lock "refresh_cache-55ca3e89-807f-473c-8b5b-346fc2ea23f8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1770.562297] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1770.602357] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 
tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1770.761140] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Updating instance_info_cache with network_info: [{"id": "ea922daa-1d55-4604-9060-3c76abcf0a74", "address": "fa:16:3e:7a:04:8f", "network": {"id": "15988815-7f77-497b-8789-621dcedc277d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2493165-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "06346705ad3e441ba21ce436297e94d3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e2e8b74b-aa27-4f31-9414-7bcf531e8642", "external-id": "nsx-vlan-transportzone-544", "segmentation_id": 544, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea922daa-1d", "ovs_interfaceid": "ea922daa-1d55-4604-9060-3c76abcf0a74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1770.772182] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Releasing lock "refresh_cache-55ca3e89-807f-473c-8b5b-346fc2ea23f8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1770.772460] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance network_info: |[{"id": "ea922daa-1d55-4604-9060-3c76abcf0a74", "address": "fa:16:3e:7a:04:8f", "network": {"id": "15988815-7f77-497b-8789-621dcedc277d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2493165-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "06346705ad3e441ba21ce436297e94d3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e2e8b74b-aa27-4f31-9414-7bcf531e8642", "external-id": "nsx-vlan-transportzone-544", "segmentation_id": 544, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea922daa-1d", "ovs_interfaceid": "ea922daa-1d55-4604-9060-3c76abcf0a74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1770.772848] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7a:04:8f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e2e8b74b-aa27-4f31-9414-7bcf531e8642', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ea922daa-1d55-4604-9060-3c76abcf0a74', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1770.780354] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Creating folder: Project (06346705ad3e441ba21ce436297e94d3). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1770.780844] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-74497970-840b-4dd3-b7e4-e95edf034fdb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.792948] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Created folder: Project (06346705ad3e441ba21ce436297e94d3) in parent group-v449629. [ 1770.793109] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Creating folder: Instances. Parent ref: group-v449737. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1770.793332] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0f726cc2-51c4-40ff-a18b-3e945a7883ec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.801916] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Created folder: Instances in parent group-v449737. [ 1770.802148] env[60764]: DEBUG oslo.service.loopingcall [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1770.802326] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1770.802515] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e0c3a667-ae73-476c-b475-f1c47d7a1e89 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.820445] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1770.820445] env[60764]: value = "task-2205032" [ 1770.820445] env[60764]: _type = "Task" [ 1770.820445] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1770.827682] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205032, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1770.914178] env[60764]: DEBUG nova.compute.manager [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Received event network-vif-plugged-ea922daa-1d55-4604-9060-3c76abcf0a74 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1770.914333] env[60764]: DEBUG oslo_concurrency.lockutils [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] Acquiring lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1770.914540] env[60764]: DEBUG oslo_concurrency.lockutils [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1770.914697] env[60764]: DEBUG oslo_concurrency.lockutils [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1770.914861] env[60764]: DEBUG nova.compute.manager [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] No waiting events found dispatching network-vif-plugged-ea922daa-1d55-4604-9060-3c76abcf0a74 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1770.915034] env[60764]: WARNING nova.compute.manager [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Received unexpected event network-vif-plugged-ea922daa-1d55-4604-9060-3c76abcf0a74 for instance with vm_state building and task_state spawning. 
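The "Waiting for the task ... to complete" block above and the repeated "CreateVM_Task progress is 99%" entries that follow are the visible side of a fixed-interval polling loop driven by oslo.service. A minimal sketch of that pattern, assuming a caller-supplied check_done callable and a 0.5-second interval (both chosen for illustration to match the roughly half-second cadence of the progress lines; this is not the oslo.vmware implementation itself):

from oslo_service import loopingcall

def wait_until_done(check_done, interval=0.5):
    # check_done() is polled every `interval` seconds on a green thread; it
    # should raise loopingcall.LoopingCallDone(retvalue=result) once the
    # vCenter task reports success, which stops the loop and returns result.
    timer = loopingcall.FixedIntervalLoopingCall(check_done)
    return timer.start(interval=interval).wait()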
[ 1770.915211] env[60764]: DEBUG nova.compute.manager [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Received event network-changed-ea922daa-1d55-4604-9060-3c76abcf0a74 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1770.915408] env[60764]: DEBUG nova.compute.manager [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Refreshing instance network info cache due to event network-changed-ea922daa-1d55-4604-9060-3c76abcf0a74. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1770.915594] env[60764]: DEBUG oslo_concurrency.lockutils [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] Acquiring lock "refresh_cache-55ca3e89-807f-473c-8b5b-346fc2ea23f8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1770.915726] env[60764]: DEBUG oslo_concurrency.lockutils [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] Acquired lock "refresh_cache-55ca3e89-807f-473c-8b5b-346fc2ea23f8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1770.915879] env[60764]: DEBUG nova.network.neutron [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Refreshing network info cache for port ea922daa-1d55-4604-9060-3c76abcf0a74 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1771.162361] env[60764]: DEBUG nova.network.neutron [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Updated VIF entry in instance network info cache for port ea922daa-1d55-4604-9060-3c76abcf0a74. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1771.162780] env[60764]: DEBUG nova.network.neutron [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Updating instance_info_cache with network_info: [{"id": "ea922daa-1d55-4604-9060-3c76abcf0a74", "address": "fa:16:3e:7a:04:8f", "network": {"id": "15988815-7f77-497b-8789-621dcedc277d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2493165-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "06346705ad3e441ba21ce436297e94d3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e2e8b74b-aa27-4f31-9414-7bcf531e8642", "external-id": "nsx-vlan-transportzone-544", "segmentation_id": 544, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea922daa-1d", "ovs_interfaceid": "ea922daa-1d55-4604-9060-3c76abcf0a74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1771.172120] env[60764]: DEBUG oslo_concurrency.lockutils [req-0122f62a-d8aa-4e06-b12d-b285da0b2776 req-d778ab2c-78b1-4fce-ad29-cabf7d82cad8 service nova] Releasing lock "refresh_cache-55ca3e89-807f-473c-8b5b-346fc2ea23f8" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1771.330375] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205032, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1771.830677] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205032, 'name': CreateVM_Task} progress is 99%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1772.331716] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205032, 'name': CreateVM_Task, 'duration_secs': 1.287998} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1772.331897] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1772.332606] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1772.332799] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1772.333115] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1772.333391] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-76fc8904-e4d0-441d-892f-2774de7ae8c3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1772.337700] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Waiting for the task: (returnval){ [ 1772.337700] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ea8d54-e5bc-ccde-843d-53b55f5132fa" [ 1772.337700] env[60764]: _type = "Task" [ 1772.337700] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1772.345468] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ea8d54-e5bc-ccde-843d-53b55f5132fa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1772.847896] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1772.848213] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1772.848395] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1773.329896] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1773.329896] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1773.329896] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1774.326629] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1776.330460] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1791.311883] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1815.263949] env[60764]: WARNING oslo_vmware.rw_handles [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1815.263949] env[60764]: ERROR oslo_vmware.rw_handles [ 1815.264660] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1815.266569] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1815.266842] env[60764]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Copying Virtual Disk [datastore2] vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/a5d17515-090e-4379-9afa-805d4fee3a50/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1815.267176] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-01f0a337-83d5-497d-8a27-c9df378fee1f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1815.275014] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Waiting for the task: (returnval){ [ 1815.275014] env[60764]: value = "task-2205033" [ 1815.275014] env[60764]: _type = "Task" [ 1815.275014] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1815.282904] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Task: {'id': task-2205033, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1815.786119] env[60764]: DEBUG oslo_vmware.exceptions [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1815.786402] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1815.786946] env[60764]: ERROR nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1815.786946] env[60764]: Faults: ['InvalidArgument'] [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Traceback (most recent call last): [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] yield resources [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self.driver.spawn(context, instance, image_meta, [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self._fetch_image_if_missing(context, vi) [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] image_cache(vi, tmp_image_ds_loc) [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] vm_util.copy_virtual_disk( [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] session._wait_for_task(vmdk_copy_task) [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] return self.wait_for_task(task_ref) [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] return evt.wait() [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] result = hub.switch() [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] return self.greenlet.switch() [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self.f(*self.args, **self.kw) [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] raise exceptions.translate_fault(task_info.error) [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Faults: ['InvalidArgument'] [ 1815.786946] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] [ 1815.788026] env[60764]: INFO nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Terminating instance [ 1815.788848] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1815.789066] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1815.789316] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e8fc516b-dcc2-472a-870d-bc944b23fad8 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1815.791458] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1815.791687] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1815.792426] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0741a8df-e0d2-4f4f-b5ff-a5037b0d5df9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1815.799356] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1815.799594] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4032da44-7fc4-4670-a08d-d6cdd25546f0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1815.801639] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1815.801816] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1815.802791] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa043f12-0f32-41eb-a46f-c2e00f6155b6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1815.807560] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Waiting for the task: (returnval){ [ 1815.807560] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52a691d8-8a9b-0bbb-73ee-7edb9671ed90" [ 1815.807560] env[60764]: _type = "Task" [ 1815.807560] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1815.814718] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52a691d8-8a9b-0bbb-73ee-7edb9671ed90, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1815.870572] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1815.870572] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1815.870572] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Deleting the datastore file [datastore2] 51512549-4c6e-41d4-98b0-7d1e801a8b69 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1815.870852] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d37ec63a-07c1-4b1b-99a7-70fc31a68766 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1815.876443] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Waiting for the task: (returnval){ [ 1815.876443] env[60764]: value = "task-2205035" [ 1815.876443] env[60764]: _type = "Task" [ 1815.876443] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1815.883803] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Task: {'id': task-2205035, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1816.318069] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1816.318069] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Creating directory with path [datastore2] vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1816.318501] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c52aec1a-10d3-43ce-894e-14053c0e4aa0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.329648] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Created directory with path [datastore2] vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1816.329829] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Fetch image to [datastore2] vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1816.330015] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1816.330751] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab74b76-d54c-4ab3-8665-ee47f821a8c1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.336948] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d16f919-6411-4834-a453-207bdd87ef60 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.345782] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a11afd0c-96be-4092-8896-4754c51712a2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.376651] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f65e649d-ad2e-49cf-b390-6f6a2be46e28 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.386337] env[60764]: DEBUG oslo_vmware.api [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Task: {'id': task-2205035, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077881} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1816.387442] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1816.387634] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1816.387803] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1816.387970] env[60764]: INFO nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1816.389984] env[60764]: DEBUG nova.compute.claims [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1816.390163] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1816.390384] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1816.392875] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1d522595-f688-4c0f-a922-d9088fb2b3d7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.414262] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1816.463102] env[60764]: DEBUG oslo_vmware.rw_handles [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1816.524343] env[60764]: DEBUG oslo_vmware.rw_handles [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1816.524542] env[60764]: DEBUG oslo_vmware.rw_handles [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1816.616977] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5298247f-1dbf-4cf4-aa46-a19e16f382ed {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.624509] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-042e6672-3470-4bb2-b815-c96251decdfc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.653575] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fde6402e-80f6-4fb5-acb6-e2f4bfea610b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.660732] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25dae8ee-942b-4779-9e8f-10afb12b1554 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1816.674303] env[60764]: DEBUG nova.compute.provider_tree [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1816.683756] env[60764]: DEBUG nova.scheduler.client.report [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1816.698864] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1816.699452] env[60764]: ERROR nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1816.699452] env[60764]: Faults: ['InvalidArgument'] [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Traceback (most recent call last): [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1816.699452] env[60764]: ERROR nova.compute.manager 
[instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self.driver.spawn(context, instance, image_meta, [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self._fetch_image_if_missing(context, vi) [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] image_cache(vi, tmp_image_ds_loc) [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] vm_util.copy_virtual_disk( [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] session._wait_for_task(vmdk_copy_task) [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] return self.wait_for_task(task_ref) [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] return evt.wait() [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] result = hub.switch() [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] return self.greenlet.switch() [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] self.f(*self.args, **self.kw) [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] raise exceptions.translate_fault(task_info.error) [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Faults: ['InvalidArgument'] [ 1816.699452] env[60764]: ERROR nova.compute.manager [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] [ 1816.700243] env[60764]: DEBUG nova.compute.utils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1816.701462] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Build of instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 was re-scheduled: A specified parameter was not correct: fileType [ 1816.701462] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1816.701827] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1816.701999] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1816.702195] env[60764]: DEBUG nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1816.702410] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1817.117914] env[60764]: DEBUG nova.network.neutron [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1817.130087] env[60764]: INFO nova.compute.manager [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Took 0.43 seconds to deallocate network for instance. [ 1817.227617] env[60764]: INFO nova.scheduler.client.report [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Deleted allocations for instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 [ 1817.253972] env[60764]: DEBUG oslo_concurrency.lockutils [None req-dd13ea1d-bf63-4c9f-979c-e30ed3c3d172 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 627.030s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1817.255137] env[60764]: DEBUG oslo_concurrency.lockutils [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 431.960s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1817.255352] env[60764]: DEBUG oslo_concurrency.lockutils [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Acquiring lock "51512549-4c6e-41d4-98b0-7d1e801a8b69-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1817.255551] env[60764]: DEBUG oslo_concurrency.lockutils [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1817.255713] env[60764]: DEBUG oslo_concurrency.lockutils [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1817.257840] env[60764]: INFO nova.compute.manager [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Terminating instance [ 1817.259440] env[60764]: DEBUG nova.compute.manager [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1817.259661] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1817.260142] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-55d7ef32-03b3-44fd-926e-99eed8026d36 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.270830] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb7ca4ce-abf6-4a03-8e64-c77806583f94 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.281465] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1817.302129] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 51512549-4c6e-41d4-98b0-7d1e801a8b69 could not be found. [ 1817.302129] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1817.302129] env[60764]: INFO nova.compute.manager [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1817.302129] env[60764]: DEBUG oslo.service.loopingcall [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1817.302314] env[60764]: DEBUG nova.compute.manager [-] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1817.302314] env[60764]: DEBUG nova.network.neutron [-] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1817.327865] env[60764]: DEBUG nova.network.neutron [-] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1817.334944] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1817.335239] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1817.336648] env[60764]: INFO nova.compute.claims [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1817.339665] env[60764]: INFO nova.compute.manager [-] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] Took 0.04 seconds to deallocate network for instance. [ 1817.427981] env[60764]: DEBUG oslo_concurrency.lockutils [None req-02bf7387-36eb-449f-88a5-40e5606e4850 tempest-ServerGroupTestJSON-880341431 tempest-ServerGroupTestJSON-880341431-project-member] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.173s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1817.431039] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 114.164s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1817.431251] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 51512549-4c6e-41d4-98b0-7d1e801a8b69] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1817.431431] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "51512549-4c6e-41d4-98b0-7d1e801a8b69" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1817.506261] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4831330-3241-41aa-b975-a5740a5dcf46 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.514230] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b72201-6925-4e60-a0d6-d665ea60caf8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.544557] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce850178-3325-4522-935f-e776cd8994b6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.551206] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ee04f9-806e-443d-8d94-1f523c497111 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.563962] env[60764]: DEBUG nova.compute.provider_tree [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1817.572880] env[60764]: DEBUG nova.scheduler.client.report [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1817.586388] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1817.586838] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1817.616717] env[60764]: DEBUG nova.compute.utils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1817.618258] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1817.618364] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1817.627413] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1817.685493] env[60764]: DEBUG nova.policy [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1154fa431dad4ae1ae467fc3ea6206b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c4f5a1b557e4c31b54b7f87223a20d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1817.696344] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1817.723314] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1817.723561] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1817.723721] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1817.723897] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1817.724050] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1817.724457] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1817.724457] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1817.724569] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1817.724701] env[60764]: DEBUG 
nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1817.724860] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1817.725044] env[60764]: DEBUG nova.virt.hardware [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1817.725970] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4733c04c-dd1f-4075-9618-1467b47ebbdd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1817.734294] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac07c51-3c01-4ec1-ba36-904f17ee81e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1818.003160] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Successfully created port: e3799134-5323-4074-8595-a60d220e7491 {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1818.617891] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Successfully updated port: e3799134-5323-4074-8595-a60d220e7491 {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1818.629572] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "refresh_cache-aa9f1e61-ac26-495c-a698-5163661401a5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1818.629572] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "refresh_cache-aa9f1e61-ac26-495c-a698-5163661401a5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1818.629572] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1818.666679] env[60764]: DEBUG nova.network.neutron [None 
req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1819.051129] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Updating instance_info_cache with network_info: [{"id": "e3799134-5323-4074-8595-a60d220e7491", "address": "fa:16:3e:84:51:8f", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3799134-53", "ovs_interfaceid": "e3799134-5323-4074-8595-a60d220e7491", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1819.062806] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "refresh_cache-aa9f1e61-ac26-495c-a698-5163661401a5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1819.063120] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance network_info: |[{"id": "e3799134-5323-4074-8595-a60d220e7491", "address": "fa:16:3e:84:51:8f", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3799134-53", "ovs_interfaceid": 
"e3799134-5323-4074-8595-a60d220e7491", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1819.063521] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:84:51:8f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd177c5b3-a5b1-4c78-854e-7e0dbf341ea1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e3799134-5323-4074-8595-a60d220e7491', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1819.071300] env[60764]: DEBUG oslo.service.loopingcall [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1819.071764] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1819.072013] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6bc9739e-9067-49b8-b4a6-d16a29e3026a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1819.092898] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1819.092898] env[60764]: value = "task-2205036" [ 1819.092898] env[60764]: _type = "Task" [ 1819.092898] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1819.100357] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205036, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1819.159554] env[60764]: DEBUG nova.compute.manager [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Received event network-vif-plugged-e3799134-5323-4074-8595-a60d220e7491 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1819.159875] env[60764]: DEBUG oslo_concurrency.lockutils [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] Acquiring lock "aa9f1e61-ac26-495c-a698-5163661401a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1819.160118] env[60764]: DEBUG oslo_concurrency.lockutils [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] Lock "aa9f1e61-ac26-495c-a698-5163661401a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1819.160310] env[60764]: DEBUG oslo_concurrency.lockutils [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] Lock "aa9f1e61-ac26-495c-a698-5163661401a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1819.160518] env[60764]: DEBUG nova.compute.manager [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] No waiting events found dispatching network-vif-plugged-e3799134-5323-4074-8595-a60d220e7491 {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1819.160727] env[60764]: WARNING nova.compute.manager [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Received unexpected event network-vif-plugged-e3799134-5323-4074-8595-a60d220e7491 for instance with vm_state building and task_state spawning. [ 1819.160892] env[60764]: DEBUG nova.compute.manager [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Received event network-changed-e3799134-5323-4074-8595-a60d220e7491 {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1819.161107] env[60764]: DEBUG nova.compute.manager [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Refreshing instance network info cache due to event network-changed-e3799134-5323-4074-8595-a60d220e7491. 
{{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1819.161307] env[60764]: DEBUG oslo_concurrency.lockutils [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] Acquiring lock "refresh_cache-aa9f1e61-ac26-495c-a698-5163661401a5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1819.161447] env[60764]: DEBUG oslo_concurrency.lockutils [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] Acquired lock "refresh_cache-aa9f1e61-ac26-495c-a698-5163661401a5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1819.161603] env[60764]: DEBUG nova.network.neutron [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Refreshing network info cache for port e3799134-5323-4074-8595-a60d220e7491 {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1819.454203] env[60764]: DEBUG nova.network.neutron [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Updated VIF entry in instance network info cache for port e3799134-5323-4074-8595-a60d220e7491. {{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1819.454535] env[60764]: DEBUG nova.network.neutron [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Updating instance_info_cache with network_info: [{"id": "e3799134-5323-4074-8595-a60d220e7491", "address": "fa:16:3e:84:51:8f", "network": {"id": "d045d290-b078-42dc-b8c5-cc9de065ce4b", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1646899007-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c4f5a1b557e4c31b54b7f87223a20d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d177c5b3-a5b1-4c78-854e-7e0dbf341ea1", "external-id": "nsx-vlan-transportzone-54", "segmentation_id": 54, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape3799134-53", "ovs_interfaceid": "e3799134-5323-4074-8595-a60d220e7491", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1819.463920] env[60764]: DEBUG oslo_concurrency.lockutils [req-658499ed-8921-4b76-bca3-bcdfd5b38257 req-4994add2-130e-4250-9cee-65af6c767cc2 service nova] Releasing lock "refresh_cache-aa9f1e61-ac26-495c-a698-5163661401a5" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1819.602840] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205036, 'name': CreateVM_Task, 'duration_secs': 0.275986} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1819.602999] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1819.609420] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1819.609581] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1819.609941] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1819.610191] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-921e6d56-5026-4b78-9f6a-82c8adeeda4e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1819.614273] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 1819.614273] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]528e4dbf-1d54-cc65-7959-4da5f15705b6" [ 1819.614273] env[60764]: _type = "Task" [ 1819.614273] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1819.621581] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]528e4dbf-1d54-cc65-7959-4da5f15705b6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1820.123962] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1820.124230] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1820.124439] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1825.331110] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1825.331110] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1825.331110] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1825.352087] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.352246] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.352449] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.352598] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.352725] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.352845] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.352962] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.353094] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.353214] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.353330] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1825.353445] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1827.331128] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1827.341894] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1827.342151] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1827.342343] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1827.342498] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1827.343655] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed162ce4-ea5c-496b-a64e-cfec3eff57d2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.352451] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9659a14-b802-46f8-a454-aae658cb0a78 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.366592] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08981394-7dee-4d86-ac85-fd944c3a5fe7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.372814] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3ca4b41-ade6-4c71-8aef-5ea2909eeab6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.402544] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181265MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1827.402688] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1827.402871] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1827.474514] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance bf522599-8aa5-411a-96dd-8bd8328d9156 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.474674] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.474803] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.474926] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475063] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475189] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475304] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475418] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475531] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475642] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1827.475823] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1827.475969] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1827.595893] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63adc085-43d5-47f8-8778-aa7b93fad8ef {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.603585] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0756ea68-db6e-4dbc-b915-b9d2a56d93d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.631965] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-941b92c2-3e75-409b-ad7d-da0ec2597c95 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.638346] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dde36a20-bd36-42dd-b894-0140b88ab06c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.651776] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1827.659388] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1827.672543] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1827.672720] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1828.671387] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1828.671685] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1830.331145] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1831.325478] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1834.330158] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1834.330589] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1834.330689] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1834.330749] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1838.331463] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1862.164192] env[60764]: WARNING oslo_vmware.rw_handles [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1862.164192] env[60764]: ERROR oslo_vmware.rw_handles [ 1862.164820] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1862.166766] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1862.167067] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Copying Virtual Disk [datastore2] vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/e23754a7-3a0a-4743-b1b5-5dd2a5b4b04f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1862.167390] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0e8a3dbe-4762-422d-afa3-9937ac2ef56e {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.175383] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Waiting for the task: (returnval){ [ 1862.175383] env[60764]: value = "task-2205037" [ 1862.175383] env[60764]: _type = "Task" [ 1862.175383] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1862.183205] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Task: {'id': task-2205037, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1862.685693] env[60764]: DEBUG oslo_vmware.exceptions [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1862.685988] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1862.686583] env[60764]: ERROR nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1862.686583] env[60764]: Faults: ['InvalidArgument'] [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Traceback (most recent call last): [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] yield resources [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self.driver.spawn(context, instance, image_meta, [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self._fetch_image_if_missing(context, vi) [ 1862.686583] 
env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] image_cache(vi, tmp_image_ds_loc) [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] vm_util.copy_virtual_disk( [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] session._wait_for_task(vmdk_copy_task) [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] return self.wait_for_task(task_ref) [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] return evt.wait() [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] result = hub.switch() [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] return self.greenlet.switch() [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self.f(*self.args, **self.kw) [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] raise exceptions.translate_fault(task_info.error) [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Faults: ['InvalidArgument'] [ 1862.686583] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] [ 1862.687918] env[60764]: INFO nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 
tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Terminating instance [ 1862.688428] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1862.688639] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1862.688875] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd4fe85e-69b6-496d-964f-da0fa1c70a7c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.691086] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1862.691282] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1862.692044] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c74149c-20e1-4f52-9f5d-d3301789eb15 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.698886] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1862.699141] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e9dfde3-90b2-402e-abd7-91fddc8c6dc7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.701288] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1862.701463] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1862.702416] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2cfc4914-91f4-4e42-b65c-bc7f2066cdc6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.706951] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Waiting for the task: (returnval){ [ 1862.706951] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52fc0c1f-4a4b-2335-c2bc-16f537f57b28" [ 1862.706951] env[60764]: _type = "Task" [ 1862.706951] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1862.715617] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52fc0c1f-4a4b-2335-c2bc-16f537f57b28, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1862.771328] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1862.771555] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1862.771728] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Deleting the datastore file [datastore2] bf522599-8aa5-411a-96dd-8bd8328d9156 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1862.772015] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e84f9a27-410b-4ef4-ad28-e55d847f3b9b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1862.777767] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Waiting for the task: (returnval){ [ 1862.777767] env[60764]: value = "task-2205039" [ 1862.777767] env[60764]: _type = "Task" [ 1862.777767] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1862.785045] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Task: {'id': task-2205039, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1863.216967] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1863.217416] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Creating directory with path [datastore2] vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1863.217484] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d8918afe-210c-45bd-870d-9f8ca52b63d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.229693] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Created directory with path [datastore2] vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1863.229878] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Fetch image to [datastore2] vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1863.230069] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1863.230806] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87189c7f-32a7-4fe5-a4c7-b15840d93acf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.237276] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f7b78ee-00f9-4a32-b4dc-9fa5e3317ffa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.247352] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dfa3c27-d58c-48fa-9a75-8e0f03dfb691 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.278131] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4957a88d-6205-4f70-ada6-07f31799f959 
{{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.289097] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-457bbd2c-cc1f-4862-844c-fe52d7e64f10 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.290716] env[60764]: DEBUG oslo_vmware.api [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Task: {'id': task-2205039, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07433} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1863.290953] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1863.291183] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1863.291377] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1863.291547] env[60764]: INFO nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1863.293714] env[60764]: DEBUG nova.compute.claims [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1863.293879] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1863.294103] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1863.314796] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1863.444137] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1863.504269] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1863.504459] env[60764]: DEBUG oslo_vmware.rw_handles [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1863.513722] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10cb7b9d-0be3-4d9c-a89a-6f92aecd9e32 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.521837] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c77913b7-3e8f-4938-8b51-56b6bfb61571 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.552263] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f089a87-b1b3-4bc5-b6f9-2ae2338c7cee {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.559083] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34d81fe1-a746-444e-91dd-1f759130338b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1863.571976] env[60764]: DEBUG nova.compute.provider_tree [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1863.581585] env[60764]: DEBUG nova.scheduler.client.report [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1863.595857] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1863.595857] env[60764]: ERROR nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1863.595857] env[60764]: Faults: ['InvalidArgument'] [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Traceback (most recent call last): [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1863.595857] env[60764]: ERROR nova.compute.manager 
[instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self.driver.spawn(context, instance, image_meta, [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self._fetch_image_if_missing(context, vi) [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] image_cache(vi, tmp_image_ds_loc) [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] vm_util.copy_virtual_disk( [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] session._wait_for_task(vmdk_copy_task) [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] return self.wait_for_task(task_ref) [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] return evt.wait() [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] result = hub.switch() [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] return self.greenlet.switch() [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] self.f(*self.args, **self.kw) [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] raise exceptions.translate_fault(task_info.error) [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Faults: ['InvalidArgument'] [ 1863.595857] env[60764]: ERROR nova.compute.manager [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] [ 1863.596761] env[60764]: DEBUG nova.compute.utils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1863.597727] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Build of instance bf522599-8aa5-411a-96dd-8bd8328d9156 was re-scheduled: A specified parameter was not correct: fileType [ 1863.597727] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1863.598112] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1863.598292] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1863.598452] env[60764]: DEBUG nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1863.598616] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1864.068763] env[60764]: DEBUG nova.network.neutron [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1864.080784] env[60764]: INFO nova.compute.manager [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Took 0.48 seconds to deallocate network for instance. [ 1864.170531] env[60764]: INFO nova.scheduler.client.report [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Deleted allocations for instance bf522599-8aa5-411a-96dd-8bd8328d9156 [ 1864.189670] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40cedf93-7f23-4b7a-a063-2081c5f6ec4d tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 620.710s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1864.189941] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.867s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1864.190199] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Acquiring lock "bf522599-8aa5-411a-96dd-8bd8328d9156-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1864.190408] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1864.190570] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1864.192457] env[60764]: INFO nova.compute.manager [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Terminating instance [ 1864.194543] env[60764]: DEBUG nova.compute.manager [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1864.194757] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1864.195241] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d2ca8830-2940-4987-a35a-e940b10358b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1864.204709] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c55e350f-744c-4973-ad4d-02b1212cb224 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1864.237771] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bf522599-8aa5-411a-96dd-8bd8328d9156 could not be found. [ 1864.238042] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1864.238206] env[60764]: INFO nova.compute.manager [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1864.238465] env[60764]: DEBUG oslo.service.loopingcall [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1864.238679] env[60764]: DEBUG nova.compute.manager [-] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1864.238778] env[60764]: DEBUG nova.network.neutron [-] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1864.271985] env[60764]: DEBUG nova.network.neutron [-] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1864.280159] env[60764]: INFO nova.compute.manager [-] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] Took 0.04 seconds to deallocate network for instance. [ 1864.368735] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ece7041d-64b9-439f-ab83-7a083cf0c8dd tempest-ServerRescueTestJSON-52309402 tempest-ServerRescueTestJSON-52309402-project-member] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1864.369633] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 161.102s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1864.369818] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: bf522599-8aa5-411a-96dd-8bd8328d9156] During sync_power_state the instance has a pending task (deleting). Skip. 
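The traceback above is the recurring failure mode in this run: vm_util.copy_virtual_disk hands the CopyVirtualDisk_Task to session._wait_for_task, oslo.vmware polls the task from a looping call, and an errored task is re-raised via exceptions.translate_fault() as VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), which pushes the build into re-schedule. The snippet below is a minimal sketch of that poll-and-translate pattern, simplified to a blocking loop (the real code drives it from an eventlet looping call, as the traceback shows); get_task_info and TaskFailed are illustrative stand-ins for this example, not the oslo.vmware API.

# Simplified sketch of the wait_for_task flow shown in the traceback above:
# poll the vCenter task until it succeeds or errors, and surface an errored
# task as an exception (oslo.vmware does this via
# exceptions.translate_fault(task_info.error), which is what produces the
# VimFaultException with Faults: ['InvalidArgument'] seen here).
# get_task_info and TaskFailed are illustrative stand-ins, not the real API.
import time

class TaskFailed(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, message, fault_list):
        super().__init__(message)
        self.fault_list = fault_list      # e.g. ['InvalidArgument']

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Poll task_ref until it finishes; raise TaskFailed if it errors."""
    while True:
        info = get_task_info(task_ref)    # one property-collector round trip
        if info.state == "success":
            return info
        if info.state == "error":
            # Mirrors _poll_task: turn the task's error into an exception
            # that the compute manager catches and re-schedules on.
            raise TaskFailed(info.error_message, info.fault_list)
        time.sleep(poll_interval)         # 'queued'/'running': keep polling

The lock bookkeeping just above is also worth pulling out when reading a run like this: the _locked_do_build_and_run_instance lock on bf522599-8aa5-411a-96dd-8bd8328d9156 was held for 620.710s before the re-schedule completed, the terminate path waited 424.867s for it, and _sync_power_states waited another 161.102s. A small post-processing helper (not Nova code; the 60-second threshold and default log file name are assumptions for the example) can pull those timings straight out of lines in the format shown here:

# Illustrative log post-processing sketch: extract oslo.concurrency lock
# timings ("... :: held 620.710s" / "... :: waited 424.867s") from lines in
# the format above and report the slow ones.  Threshold and default file
# name are assumptions for the example.
import re
import sys

LOCK_RE = re.compile(
    r'Lock "(?P<name>[^"]+)" .*?:: (?P<kind>held|waited) (?P<secs>\d+\.\d+)s')

def slow_locks(lines, threshold=60.0):
    """Yield (seconds, 'held'|'waited', lock name) above the threshold."""
    for line in lines:
        for m in LOCK_RE.finditer(line):
            secs = float(m.group("secs"))
            if secs >= threshold:
                yield secs, m.group("kind"), m.group("name")

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "n-cpu.log"  # assumed name
    with open(path) as fh:
        for secs, kind, name in sorted(slow_locks(fh), reverse=True):
            print(f"{secs:10.3f}s {kind:>6}  {name}")

Run against this log, it would surface the 620.710s hold and the 424.867s and 161.102s waits noted above; the routine sub-second lock events stay below the threshold.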
[ 1864.370061] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "bf522599-8aa5-411a-96dd-8bd8328d9156" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1874.447088] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "16597080-42c7-40df-9893-38751d9ac11a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1874.447088] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "16597080-42c7-40df-9893-38751d9ac11a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1874.458156] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Starting instance... {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1874.505436] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1874.505684] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1874.507106] env[60764]: INFO nova.compute.claims [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1874.659037] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64f44351-7162-4166-b739-a8086c9724de {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.666725] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf967123-7766-4507-9957-f04eff925ef4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.696656] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a61594a5-8b27-45b7-a283-e4a0e5006af7 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.703831] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30f431ff-6c99-4626-9ce8-87fe5fc12019 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.718238] env[60764]: DEBUG nova.compute.provider_tree [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1874.726964] env[60764]: DEBUG nova.scheduler.client.report [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1874.740444] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1874.740444] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1874.773076] env[60764]: DEBUG nova.compute.utils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1874.774543] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1874.774712] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1874.784473] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1874.838064] env[60764]: DEBUG nova.policy [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8165c7e326c4016a42ba39f68abfce6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5ed1c9589f44a86909b417fac99dab5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 1874.851752] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1874.876526] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1874.876768] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1874.876922] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1874.877121] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1874.877269] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1874.877412] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 
tempest-ImagesTestJSON-2052909825-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1874.877710] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1874.877896] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1874.878084] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1874.878247] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1874.878414] env[60764]: DEBUG nova.virt.hardware [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1874.879274] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4953c273-c123-4960-be56-6654a0477adf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.887516] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8177b5a-fc6a-4118-9a33-04e33a374d43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.184730] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Successfully created port: 6e28d94d-0fd2-4c13-8454-75d44e699e6d {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1875.751079] env[60764]: DEBUG nova.compute.manager [req-db90beab-46f2-48ba-a8e3-6f51c28674c9 req-ac075cb2-2bb0-4612-9276-1f8046960e62 service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Received event network-vif-plugged-6e28d94d-0fd2-4c13-8454-75d44e699e6d {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1875.751361] env[60764]: DEBUG oslo_concurrency.lockutils [req-db90beab-46f2-48ba-a8e3-6f51c28674c9 req-ac075cb2-2bb0-4612-9276-1f8046960e62 service nova] Acquiring lock "16597080-42c7-40df-9893-38751d9ac11a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1875.751616] env[60764]: DEBUG oslo_concurrency.lockutils [req-db90beab-46f2-48ba-a8e3-6f51c28674c9 req-ac075cb2-2bb0-4612-9276-1f8046960e62 service nova] Lock "16597080-42c7-40df-9893-38751d9ac11a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1875.751809] env[60764]: DEBUG oslo_concurrency.lockutils [req-db90beab-46f2-48ba-a8e3-6f51c28674c9 req-ac075cb2-2bb0-4612-9276-1f8046960e62 service nova] Lock "16597080-42c7-40df-9893-38751d9ac11a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1875.752009] env[60764]: DEBUG nova.compute.manager [req-db90beab-46f2-48ba-a8e3-6f51c28674c9 req-ac075cb2-2bb0-4612-9276-1f8046960e62 service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] No waiting events found dispatching network-vif-plugged-6e28d94d-0fd2-4c13-8454-75d44e699e6d {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1875.752331] env[60764]: WARNING nova.compute.manager [req-db90beab-46f2-48ba-a8e3-6f51c28674c9 req-ac075cb2-2bb0-4612-9276-1f8046960e62 service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Received unexpected event network-vif-plugged-6e28d94d-0fd2-4c13-8454-75d44e699e6d for instance with vm_state building and task_state spawning. [ 1875.833929] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Successfully updated port: 6e28d94d-0fd2-4c13-8454-75d44e699e6d {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1875.850260] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "refresh_cache-16597080-42c7-40df-9893-38751d9ac11a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1875.850403] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "refresh_cache-16597080-42c7-40df-9893-38751d9ac11a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1875.850552] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1875.907721] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1876.085929] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Updating instance_info_cache with network_info: [{"id": "6e28d94d-0fd2-4c13-8454-75d44e699e6d", "address": "fa:16:3e:6b:78:e8", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e28d94d-0f", "ovs_interfaceid": "6e28d94d-0fd2-4c13-8454-75d44e699e6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1876.098582] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "refresh_cache-16597080-42c7-40df-9893-38751d9ac11a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1876.098866] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance network_info: |[{"id": "6e28d94d-0fd2-4c13-8454-75d44e699e6d", "address": "fa:16:3e:6b:78:e8", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e28d94d-0f", "ovs_interfaceid": "6e28d94d-0fd2-4c13-8454-75d44e699e6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1876.099266] env[60764]: DEBUG 
nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:78:e8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '32463b6d-4569-4755-8a29-873a028690a7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6e28d94d-0fd2-4c13-8454-75d44e699e6d', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1876.107058] env[60764]: DEBUG oslo.service.loopingcall [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1876.107527] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1876.107755] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-09dac3fd-e5c1-4498-b25b-1754a525633f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.128160] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1876.128160] env[60764]: value = "task-2205040" [ 1876.128160] env[60764]: _type = "Task" [ 1876.128160] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1876.136016] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205040, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1876.638657] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205040, 'name': CreateVM_Task, 'duration_secs': 0.320875} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1876.638833] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1876.639563] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1876.639731] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1876.640064] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1876.640318] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9577d79f-62ae-4138-9f7e-a0281521e3fc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.644929] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 1876.644929] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]520618d1-1007-5eea-4bd5-11a61ad556c0" [ 1876.644929] env[60764]: _type = "Task" [ 1876.644929] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1876.651932] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]520618d1-1007-5eea-4bd5-11a61ad556c0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1877.157034] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1877.157977] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1877.157977] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1877.778741] env[60764]: DEBUG nova.compute.manager [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Received event network-changed-6e28d94d-0fd2-4c13-8454-75d44e699e6d {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1877.778936] env[60764]: DEBUG nova.compute.manager [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Refreshing instance network info cache due to event network-changed-6e28d94d-0fd2-4c13-8454-75d44e699e6d. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1877.779176] env[60764]: DEBUG oslo_concurrency.lockutils [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] Acquiring lock "refresh_cache-16597080-42c7-40df-9893-38751d9ac11a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1877.779320] env[60764]: DEBUG oslo_concurrency.lockutils [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] Acquired lock "refresh_cache-16597080-42c7-40df-9893-38751d9ac11a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1877.779515] env[60764]: DEBUG nova.network.neutron [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Refreshing network info cache for port 6e28d94d-0fd2-4c13-8454-75d44e699e6d {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1878.003187] env[60764]: DEBUG nova.network.neutron [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Updated VIF entry in instance network info cache for port 6e28d94d-0fd2-4c13-8454-75d44e699e6d. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1878.003549] env[60764]: DEBUG nova.network.neutron [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Updating instance_info_cache with network_info: [{"id": "6e28d94d-0fd2-4c13-8454-75d44e699e6d", "address": "fa:16:3e:6b:78:e8", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e28d94d-0f", "ovs_interfaceid": "6e28d94d-0fd2-4c13-8454-75d44e699e6d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1878.013104] env[60764]: DEBUG oslo_concurrency.lockutils [req-7b71f6ca-1710-4d3d-9141-0d244414aace req-3d28474d-c45c-4863-8b72-2bf58104f15a service nova] Releasing lock "refresh_cache-16597080-42c7-40df-9893-38751d9ac11a" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1885.331017] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1885.331358] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1885.331480] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1885.354265] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.354453] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.354662] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.354864] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.355076] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.355276] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.355473] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.355819] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.355921] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.356069] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1885.356270] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1889.331033] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1889.342839] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1889.343088] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1889.343260] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1889.343416] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1889.344507] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ea91cc5-861f-407f-a5ff-31ddd9661d81 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.353303] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5993e40a-519b-41c1-bcae-4314e25feb20 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.367105] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da8479a5-ee88-458e-8896-21508c1027dd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.373261] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-650743f3-4e39-434f-83f8-f665c6d69f9f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.403182] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181246MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1889.403329] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1889.403522] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1889.473063] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473235] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473366] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473488] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473607] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473725] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473840] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.473956] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.474089] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.474236] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1889.474434] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1889.474569] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1889.587625] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae89216a-4080-4653-af08-c1713182187c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.595205] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afa173b9-d0de-46ff-b1af-0d7b41128979 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.623751] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb3b4b2b-7503-43d4-951e-99f6bbd56c48 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.630258] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbfa8b19-78dd-4754-984a-e47387deb817 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1889.642495] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1889.651017] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1889.663664] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1889.663837] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1890.663585] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1890.663855] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1890.663989] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.331513] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.331513] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1896.325785] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1896.329357] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1900.330101] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1910.296695] env[60764]: WARNING oslo_vmware.rw_handles [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1910.296695] env[60764]: ERROR oslo_vmware.rw_handles [ 1910.297407] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1910.299156] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1910.299433] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 
tempest-ServerActionsTestJSON-1761720785-project-member] Copying Virtual Disk [datastore2] vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/045858d3-0fcf-4d66-bbaf-94d80dbf4fa3/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1910.299726] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f5858de5-5866-48ca-b6a4-01ddc9bb65b3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.306977] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Waiting for the task: (returnval){ [ 1910.306977] env[60764]: value = "task-2205041" [ 1910.306977] env[60764]: _type = "Task" [ 1910.306977] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1910.316508] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Task: {'id': task-2205041, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1910.817557] env[60764]: DEBUG oslo_vmware.exceptions [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1910.817840] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1910.818417] env[60764]: ERROR nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1910.818417] env[60764]: Faults: ['InvalidArgument'] [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Traceback (most recent call last): [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] yield resources [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self.driver.spawn(context, instance, image_meta, [ 1910.818417] env[60764]: ERROR 
nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self._fetch_image_if_missing(context, vi) [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] image_cache(vi, tmp_image_ds_loc) [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] vm_util.copy_virtual_disk( [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] session._wait_for_task(vmdk_copy_task) [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] return self.wait_for_task(task_ref) [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] return evt.wait() [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] result = hub.switch() [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] return self.greenlet.switch() [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self.f(*self.args, **self.kw) [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 
3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] raise exceptions.translate_fault(task_info.error) [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Faults: ['InvalidArgument'] [ 1910.818417] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] [ 1910.819569] env[60764]: INFO nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Terminating instance [ 1910.820311] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1910.820518] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1910.820755] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-38d4caa4-d4aa-40dc-84dc-f4746ddbf626 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.822891] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1910.823110] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1910.823821] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73e74e38-2232-458f-9f6b-b5c4ef46f5e0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.830245] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1910.830446] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e69e8958-87fb-4662-bd22-3ad0a75dfa30 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.832455] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1910.832645] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1910.833572] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ef7c1581-c9af-4b01-b186-dd1bd88d9909 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.837943] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Waiting for the task: (returnval){ [ 1910.837943] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52bd1348-0d29-f348-449f-277a411b6b7c" [ 1910.837943] env[60764]: _type = "Task" [ 1910.837943] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1910.844744] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52bd1348-0d29-f348-449f-277a411b6b7c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1910.898010] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1910.898250] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1910.898428] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Deleting the datastore file [datastore2] 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1910.898691] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-02c3bad6-7dfa-41f9-959b-3f735fb8c9e1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1910.904418] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Waiting for the task: (returnval){ [ 1910.904418] env[60764]: value = "task-2205043" [ 1910.904418] env[60764]: _type = "Task" [ 1910.904418] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1910.911534] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Task: {'id': task-2205043, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1911.348051] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1911.348382] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Creating directory with path [datastore2] vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1911.348493] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7e606675-f379-40bd-a857-19499890aa3e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.359638] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Created directory with path [datastore2] vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1911.359818] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Fetch image to [datastore2] vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1911.359982] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1911.360774] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32a93365-52c1-4ed7-84e8-17c71e1afc0e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.367407] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed7db215-7ace-4980-8956-6ef007213953 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.376065] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f58deb2c-19ef-4ee4-8b6c-0fa4c0b9e857 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.409929] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37658887-d2b8-44e2-8768-bd0313c37d7a {{(pid=60764) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.417053] env[60764]: DEBUG oslo_vmware.api [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Task: {'id': task-2205043, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073668} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1911.418533] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1911.418724] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1911.418893] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1911.419075] env[60764]: INFO nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Took 0.60 seconds to destroy the instance on the hypervisor. 
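The wait_for_task / _poll_task entries above (task-2205043 reported at 0% and then completed successfully with duration_secs 0.073668) follow a plain poll-until-terminal pattern. The sketch below illustrates that pattern only; it assumes a caller-supplied get_task_info(task_ref) returning objects with state and error attributes, and is not the oslo.vmware implementation.

import time

TERMINAL_SUCCESS = "success"
TERMINAL_ERROR = "error"

def wait_for_task(get_task_info, task_ref, interval=0.5, timeout=300):
    """Poll task_ref until it reaches a terminal state.

    get_task_info is a caller-supplied callable (an assumption of this
    sketch) returning an object with .state and .error attributes.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info(task_ref)
        if info.state == TERMINAL_SUCCESS:
            return info
        if info.state == TERMINAL_ERROR:
            # Corresponds to the "raise exceptions.translate_fault(task_info.error)"
            # frame in the tracebacks above.
            raise RuntimeError(f"Task {task_ref} failed: {info.error}")
        # While pending, the service logs "progress is N%" lines like those above.
        time.sleep(interval)
    raise TimeoutError(f"Task {task_ref} did not complete within {timeout}s")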
[ 1911.420857] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-005d1b5f-3021-43de-af1c-8d6d19cbce53 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.422833] env[60764]: DEBUG nova.compute.claims [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1911.423033] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1911.423279] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1911.449113] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1911.575139] env[60764]: DEBUG oslo_vmware.rw_handles [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1911.631428] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05050441-bca2-442f-a56b-27298dee0c45 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.636453] env[60764]: DEBUG oslo_vmware.rw_handles [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1911.636626] env[60764]: DEBUG oslo_vmware.rw_handles [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1911.640696] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ae1755a-fdf5-48f4-a6e4-4472f71a0720 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.670977] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2214930c-5eaf-4b81-9bf2-1ef25732e27a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.678285] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf9e8d36-6cac-4bd0-9a66-4c2faaad0625 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1911.690918] env[60764]: DEBUG nova.compute.provider_tree [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1911.700855] env[60764]: DEBUG nova.scheduler.client.report [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1911.715139] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.292s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1911.715709] env[60764]: ERROR nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1911.715709] env[60764]: Faults: ['InvalidArgument'] [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Traceback (most recent call last): [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self.driver.spawn(context, instance, image_meta, [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", 
line 539, in spawn [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self._fetch_image_if_missing(context, vi) [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] image_cache(vi, tmp_image_ds_loc) [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] vm_util.copy_virtual_disk( [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] session._wait_for_task(vmdk_copy_task) [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] return self.wait_for_task(task_ref) [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] return evt.wait() [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] result = hub.switch() [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] return self.greenlet.switch() [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] self.f(*self.args, **self.kw) [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] raise exceptions.translate_fault(task_info.error) [ 1911.715709] env[60764]: ERROR 
nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Faults: ['InvalidArgument'] [ 1911.715709] env[60764]: ERROR nova.compute.manager [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] [ 1911.716963] env[60764]: DEBUG nova.compute.utils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1911.717889] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Build of instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc was re-scheduled: A specified parameter was not correct: fileType [ 1911.717889] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1911.718273] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1911.718443] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1911.718608] env[60764]: DEBUG nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1911.718772] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1912.230345] env[60764]: DEBUG nova.network.neutron [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1912.245610] env[60764]: INFO nova.compute.manager [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Took 0.53 seconds to deallocate network for instance. 
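The inventory payload logged for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 makes the usable capacity easy to recompute. Below is a short sketch using the exact values from the log; the rule usable = (total - reserved) * allocation_ratio is the usual placement-style interpretation and is stated here as an assumption, not a reading of Nova's code.

# Inventory exactly as reported in the log above.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}

def usable_capacity(inv):
    # Assumed placement-style rule: usable = (total - reserved) * allocation_ratio.
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inv.items()}

print(usable_capacity(inventory))
# {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}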
[ 1912.343583] env[60764]: INFO nova.scheduler.client.report [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Deleted allocations for instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc [ 1912.364131] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3c896999-d558-4ddd-b949-d0bcbaa47894 tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 627.163s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1912.364440] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 431.696s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1912.364607] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Acquiring lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1912.364807] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1912.364971] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1912.366985] env[60764]: INFO nova.compute.manager [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Terminating instance [ 1912.368857] env[60764]: DEBUG nova.compute.manager [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1912.369065] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1912.369525] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eb12fd35-aa40-41d8-8c93-03d90c3043bc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.378731] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36954660-a74a-4a2b-ac56-08dc97332ac6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1912.406195] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc could not be found. [ 1912.406397] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1912.406574] env[60764]: INFO nova.compute.manager [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1912.406816] env[60764]: DEBUG oslo.service.loopingcall [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1912.407054] env[60764]: DEBUG nova.compute.manager [-] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1912.407157] env[60764]: DEBUG nova.network.neutron [-] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1912.429064] env[60764]: DEBUG nova.network.neutron [-] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1912.436922] env[60764]: INFO nova.compute.manager [-] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] Took 0.03 seconds to deallocate network for instance. 
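The lockutils lines above and below (for example, lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" held 627.163s while do_terminate_instance waited 431.696s for it) come from a named-lock pattern: one lock object per key, with the wait and hold durations logged around the critical section. Here is a simplified, in-process sketch of that pattern using threading; it is an illustrative analogue, not oslo.concurrency's implementation.

import threading
import time
from contextlib import contextmanager

# One lock object per name, created lazily under a small registry guard.
_locks: dict[str, threading.Lock] = {}
_registry_guard = threading.Lock()

@contextmanager
def named_lock(name, caller):
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{caller}"')
    wait_start = time.monotonic()
    lock.acquire()
    print(f'Lock "{name}" acquired by "{caller}" :: waited '
          f'{time.monotonic() - wait_start:.3f}s')
    held_start = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        print(f'Lock "{name}" "released" by "{caller}" :: held '
              f'{time.monotonic() - held_start:.3f}s')

# Usage mirroring the log: serialize operations on one instance UUID.
with named_lock("3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc", "do_terminate_instance"):
    pass  # critical section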
[ 1912.521666] env[60764]: DEBUG oslo_concurrency.lockutils [None req-49f2cb84-ed61-4a83-8182-9462026fefba tempest-ServerActionsTestJSON-1761720785 tempest-ServerActionsTestJSON-1761720785-project-member] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.157s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1912.522626] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 209.255s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1912.522838] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc] During sync_power_state the instance has a pending task (deleting). Skip. [ 1912.523060] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "3ccfc78e-f851-4d1c-b1f8-8a6ca6d5e0dc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1946.195057] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "aa9f1e61-ac26-495c-a698-5163661401a5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1947.330103] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1947.330418] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1947.330418] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1947.350263] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.350417] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.350548] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.350672] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.350797] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.350910] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.351046] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.351170] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.351289] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1947.351404] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1950.329273] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1950.329586] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1950.341194] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1950.341398] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1950.341562] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1950.341715] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1950.342861] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3184187-60f6-4693-af17-37a88a23bb05 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.351735] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ab2892a-ab32-4c38-8999-e3dc32ded69f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.365422] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e1ca1d-7b25-4624-ba3e-7a9820f1e173 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.371493] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c5b6035-f703-4d01-83fe-085fa6f0b5ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.399471] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181245MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1950.399618] env[60764]: DEBUG 
oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1950.399809] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1950.467637] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a83f4609-4c09-4056-a840-cd899af93ea3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.467799] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.467927] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468062] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468184] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468299] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468413] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468526] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468637] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.468821] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1950.468958] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1950.575243] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d41ffc8-7047-421a-95bd-6f06c220b9b9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.584270] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd348968-d450-49ed-ab02-aa6d27710598 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.613535] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd37a711-da05-4e51-a4c7-c3accdbe6f9b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.620737] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a219e38-a910-4566-aaf6-b97524d82b98 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.633389] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1950.641142] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1950.655847] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1950.655847] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1951.656579] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1952.330477] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1954.326598] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.330623] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.330972] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.331033] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1957.326186] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1959.344376] env[60764]: WARNING oslo_vmware.rw_handles [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1959.344376] env[60764]: ERROR oslo_vmware.rw_handles [ 1959.344937] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1959.346779] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1959.347176] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Copying Virtual Disk [datastore2] vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/4f3109c2-20fb-4827-9213-b8e39e88b335/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1959.347499] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ef5238e8-23ca-40bb-943c-62be0b6b5c65 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.355206] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Waiting for the task: (returnval){ [ 1959.355206] env[60764]: value = "task-2205044" [ 1959.355206] env[60764]: _type = "Task" [ 1959.355206] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1959.362592] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Task: {'id': task-2205044, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1959.865598] env[60764]: DEBUG oslo_vmware.exceptions [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1959.865913] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1959.866713] env[60764]: ERROR nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1959.866713] env[60764]: Faults: ['InvalidArgument'] [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Traceback (most recent call last): [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] yield resources [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self.driver.spawn(context, instance, image_meta, [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self._fetch_image_if_missing(context, vi) [ 
1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] image_cache(vi, tmp_image_ds_loc) [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] vm_util.copy_virtual_disk( [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] session._wait_for_task(vmdk_copy_task) [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] return self.wait_for_task(task_ref) [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] return evt.wait() [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] result = hub.switch() [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] return self.greenlet.switch() [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self.f(*self.args, **self.kw) [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] raise exceptions.translate_fault(task_info.error) [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Faults: ['InvalidArgument'] [ 1959.866713] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] [ 1959.867628] env[60764]: INFO nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 
tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Terminating instance [ 1959.868915] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1959.869149] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1959.870292] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a5c27b0-d8d8-4d58-9c7b-b931645f0b55 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.872513] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1959.872705] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1959.873503] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b65671cf-0b12-4c80-af14-2ef68becf475 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.881806] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1959.882041] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f61ef347-3485-45ed-8b16-25d9a9e7ecfd {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.884195] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1959.884363] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1959.885319] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5172547e-ac55-4eb5-bfff-d6d6487f2a90 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.890110] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 1959.890110] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ddd48f-c6f5-98e7-b925-e677d18f0c22" [ 1959.890110] env[60764]: _type = "Task" [ 1959.890110] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1959.903213] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ddd48f-c6f5-98e7-b925-e677d18f0c22, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1959.950974] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1959.951210] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1959.951389] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Deleting the datastore file [datastore2] a83f4609-4c09-4056-a840-cd899af93ea3 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1959.951665] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d41f0342-12af-4605-8143-4c73f49135bb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1959.957481] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Waiting for the task: (returnval){ [ 1959.957481] env[60764]: value = "task-2205046" [ 1959.957481] env[60764]: _type = "Task" [ 1959.957481] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1959.965033] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Task: {'id': task-2205046, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1960.400872] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1960.401285] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating directory with path [datastore2] vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1960.401381] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3588200f-5411-4030-9a93-a90c5b169a29 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.412178] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created directory with path [datastore2] vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1960.412368] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Fetch image to [datastore2] vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1960.412534] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1960.413265] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2948cf21-495f-4b91-816d-177a7621391d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.419398] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-096d54bb-3137-40b0-a1b6-50dc4e5d75b2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.427921] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c5d9ee2-52c8-45a2-95a3-ca935684ba65 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.457248] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5d3d47de-c3a4-4279-a7b4-64b3c4e3eb7a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.467496] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0ea61cc2-b335-4e4f-a190-b0c8ca030d31 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.469085] env[60764]: DEBUG oslo_vmware.api [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Task: {'id': task-2205046, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072372} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1960.469314] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1960.469485] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1960.469652] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1960.469821] env[60764]: INFO nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Took 0.60 seconds to destroy the instance on the hypervisor. 
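The entries above follow the oslo.vmware task pattern that recurs throughout this log: a vCenter call such as CopyVirtualDisk_Task or DeleteDatastoreFile_Task returns a task reference, the caller polls it ("progress is 0%", "completed successfully", "duration_secs"), and a task that ends in error is translated into a fault, here the InvalidArgument/fileType failure that aborts the spawn. A minimal, self-contained sketch of that poll-and-translate loop; the FakeTask stub, VimFault class and poll interval are illustrative stand-ins, not the oslo_vmware implementation:

    import time

    class FakeTask:
        """Stand-in for a vCenter task handle; the real driver reads TaskInfo over the vSphere SOAP API."""
        def __init__(self, states):
            self._states = iter(states)
        def poll(self):
            # Each poll yields (state, progress, fault_message), e.g. ('running', 40, None).
            return next(self._states)

    class VimFault(Exception):
        """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(task, interval=0.5):
        # Poll until the task succeeds, translating a failed task into an exception;
        # this is the shape of the wait_for_task/_poll_task entries above.
        while True:
            state, progress, fault = task.poll()
            print(f"progress is {progress}%")
            if state == 'success':
                return
            if state == 'error':
                raise VimFault(fault)
            time.sleep(interval)

    # A copy task that fails the way CopyVirtualDisk_Task does in the traceback above.
    copy_task = FakeTask([('running', 0, None),
                          ('error', 100, "A specified parameter was not correct: fileType")])
    try:
        wait_for_task(copy_task, interval=0)
    except VimFault as exc:
        print(f"task failed: {exc}")

The same loop serves both the failing disk copy and the successful datastore-file delete above; only the terminal task state differs.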
[ 1960.471874] env[60764]: DEBUG nova.compute.claims [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1960.472058] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1960.472278] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1960.494506] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1960.548579] env[60764]: DEBUG oslo_vmware.rw_handles [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1960.608108] env[60764]: DEBUG oslo_vmware.rw_handles [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1960.608293] env[60764]: DEBUG oslo_vmware.rw_handles [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1960.688976] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f1cbef-d4a8-4572-9e98-ca8c4846dcc7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.698031] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44125725-da45-48c0-98f5-61a990bed69d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.728157] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8244c9fa-5715-42be-be6d-cdec20d03b8f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.734838] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed57c393-bfce-437b-844e-b11654ebda3f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1960.747465] env[60764]: DEBUG nova.compute.provider_tree [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1960.755808] env[60764]: DEBUG nova.scheduler.client.report [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1960.769238] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.297s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1960.769745] env[60764]: ERROR nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1960.769745] env[60764]: Faults: ['InvalidArgument'] [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Traceback (most recent call last): [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1960.769745] env[60764]: ERROR nova.compute.manager 
[instance: a83f4609-4c09-4056-a840-cd899af93ea3] self.driver.spawn(context, instance, image_meta, [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self._fetch_image_if_missing(context, vi) [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] image_cache(vi, tmp_image_ds_loc) [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] vm_util.copy_virtual_disk( [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] session._wait_for_task(vmdk_copy_task) [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] return self.wait_for_task(task_ref) [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] return evt.wait() [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] result = hub.switch() [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] return self.greenlet.switch() [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] self.f(*self.args, **self.kw) [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] raise exceptions.translate_fault(task_info.error) [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Faults: ['InvalidArgument'] [ 1960.769745] env[60764]: ERROR nova.compute.manager [instance: a83f4609-4c09-4056-a840-cd899af93ea3] [ 1960.771066] env[60764]: DEBUG nova.compute.utils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1960.771731] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Build of instance a83f4609-4c09-4056-a840-cd899af93ea3 was re-scheduled: A specified parameter was not correct: fileType [ 1960.771731] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1960.772108] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1960.772278] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1960.772444] env[60764]: DEBUG nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1960.772605] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1961.112722] env[60764]: DEBUG nova.network.neutron [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1961.128021] env[60764]: INFO nova.compute.manager [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Took 0.35 seconds to deallocate network for instance. [ 1961.220028] env[60764]: INFO nova.scheduler.client.report [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Deleted allocations for instance a83f4609-4c09-4056-a840-cd899af93ea3 [ 1961.242677] env[60764]: DEBUG oslo_concurrency.lockutils [None req-40251e6d-bbb6-4397-8ccc-51d96ee68f33 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "a83f4609-4c09-4056-a840-cd899af93ea3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 605.209s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1961.242943] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "a83f4609-4c09-4056-a840-cd899af93ea3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 409.340s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1961.243179] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Acquiring lock "a83f4609-4c09-4056-a840-cd899af93ea3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1961.243452] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "a83f4609-4c09-4056-a840-cd899af93ea3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1961.243660] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "a83f4609-4c09-4056-a840-cd899af93ea3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1961.246770] env[60764]: INFO nova.compute.manager [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Terminating instance [ 1961.248682] env[60764]: DEBUG nova.compute.manager [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1961.249241] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1961.249241] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2e7b4f07-5baa-4768-b6f7-54b76b29d66b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1961.258476] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecfc13e0-9ab0-4b15-9139-f785e757e388 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1961.287629] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a83f4609-4c09-4056-a840-cd899af93ea3 could not be found. [ 1961.287833] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1961.288012] env[60764]: INFO nova.compute.manager [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1961.288282] env[60764]: DEBUG oslo.service.loopingcall [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1961.288519] env[60764]: DEBUG nova.compute.manager [-] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1961.288621] env[60764]: DEBUG nova.network.neutron [-] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1961.327516] env[60764]: DEBUG nova.network.neutron [-] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1961.336144] env[60764]: INFO nova.compute.manager [-] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] Took 0.05 seconds to deallocate network for instance. [ 1961.457260] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8e43acd1-363d-4b2a-9fb6-0bbf318e7330 tempest-AttachVolumeTestJSON-507920059 tempest-AttachVolumeTestJSON-507920059-project-member] Lock "a83f4609-4c09-4056-a840-cd899af93ea3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.214s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1961.458047] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "a83f4609-4c09-4056-a840-cd899af93ea3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 258.190s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1961.458240] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a83f4609-4c09-4056-a840-cd899af93ea3] During sync_power_state the instance has a pending task (deleting). Skip. 
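The "Acquiring lock … by …", "acquired … waited" and "released … held" lines above come from oslo.concurrency's lockutils wrappers; they serialize _locked_do_build_and_run_instance, do_terminate_instance and the power-state sync on the same instance UUID, which is why the terminate path waited 409.340s for the build to release its lock. A minimal sketch of that contention, assuming only the public lockutils.lock() context manager (nova itself wraps this in synchronized decorators; the function bodies here are illustrative):

    from oslo_concurrency import lockutils

    # UUID taken from the log entries above; both operations contend on the same name.
    INSTANCE_UUID = "a83f4609-4c09-4056-a840-cd899af93ea3"

    def build_instance():
        # Held for the whole build, so the terminate below cannot start until this returns.
        with lockutils.lock(INSTANCE_UUID):
            print("building while holding the per-instance lock")

    def terminate_instance():
        with lockutils.lock(INSTANCE_UUID):
            print("terminating; acquired only after the build path released the lock")

    build_instance()
    terminate_instance()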
[ 1961.458412] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "a83f4609-4c09-4056-a840-cd899af93ea3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1962.330615] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.812296] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1989.812578] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1989.823471] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1989.873678] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1989.873923] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1989.875368] env[60764]: INFO nova.compute.claims [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1989.985641] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "c6fbe481-a9e0-40d5-9cac-e4645a50be1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1989.985872] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "c6fbe481-a9e0-40d5-9cac-e4645a50be1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1989.995601] env[60764]: DEBUG nova.compute.manager [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Starting instance... 
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1990.038390] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1990.042771] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6b640be-a0c5-4280-b60e-2766afbe85b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.050692] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce152ab7-bcc6-4080-ace7-3cfe2048ac4e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.079848] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d549e78-b1c6-4ece-a8ac-181d16aee0d4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.086360] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aaeac10-807d-491d-9c60-c41eef595acc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.098666] env[60764]: DEBUG nova.compute.provider_tree [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1990.106974] env[60764]: DEBUG nova.scheduler.client.report [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1990.120852] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1990.121271] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Start building networks asynchronously for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1990.123384] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.085s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1990.124693] env[60764]: INFO nova.compute.claims [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1990.153242] env[60764]: DEBUG nova.compute.utils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1990.154812] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Not allocating networking since 'none' was specified. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1990.161659] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1990.226239] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Start spawning the instance on the hypervisor. 
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1990.254284] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1990.254585] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1990.254694] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1990.254872] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1990.255076] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1990.255261] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1990.255469] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1990.255623] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1990.255786] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 
tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1990.255944] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1990.256227] env[60764]: DEBUG nova.virt.hardware [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1990.257087] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb46ac7-2265-425c-a949-5efc204f7a40 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.267123] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef70bfc4-3238-43f7-93ca-e7ea708d9a4c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.283269] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance VIF info [] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1990.289454] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating folder: Project (77623e492b2642cea392e79d51e32fc2). Parent ref: group-v449629. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1990.291864] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dde1b981-f8d9-49aa-a943-3a5d08b027d5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.301212] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Created folder: Project (77623e492b2642cea392e79d51e32fc2) in parent group-v449629. [ 1990.301397] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating folder: Instances. Parent ref: group-v449742. {{(pid=60764) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1990.301613] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-de689367-1557-4bef-b051-cebce0421dc6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.311411] env[60764]: INFO nova.virt.vmwareapi.vm_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Created folder: Instances in parent group-v449742. 
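The entries just above build the Project/Instances folder hierarchy, and the entries that follow submit Folder.CreateVM_Task and then poll it until it finishes ("Waiting for the task ... to complete", "progress is 15%", "completed successfully"). A minimal sketch of that polling loop follows; it is an illustration only, not the oslo.vmware implementation, and the helper get_task_info(task_ref) plus the state names are assumptions chosen to match the shape of the _poll_task entries in this log.

import time

class TaskFailed(Exception):
    """Raised when the polled task reports an error state."""

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    # `get_task_info(task_ref)` is an assumed callable returning an object
    # with `state` in {"queued", "running", "success", "error"} plus
    # `progress` and `error` attributes -- a stand-in for the vCenter
    # TaskInfo that the _poll_task entries above and below report on.
    while True:
        info = get_task_info(task_ref)
        if info.state == "success":
            return info                      # completion is logged with duration_secs
        if info.state == "error":
            raise TaskFailed(info.error)     # upstream this becomes a translated fault
        # queued / running: report progress and poll again ("progress is 15%")
        print(f"task {task_ref}: {info.state}, progress {info.progress}%")
        time.sleep(poll_interval)

The same loop shape covers every task type seen in this log (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task): submit, poll on an interval, return on success, raise a fault on error.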
[ 1990.311629] env[60764]: DEBUG oslo.service.loopingcall [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1990.311818] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1990.312020] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fb4024e5-0588-40c8-8c5f-25092ccd1aaa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.325052] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6c3df03-84bd-4f3d-a0a3-0442999211b0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.332484] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-901c1a49-b84c-49c1-b88a-a84c06480f41 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.335066] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1990.335066] env[60764]: value = "task-2205049" [ 1990.335066] env[60764]: _type = "Task" [ 1990.335066] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1990.364458] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-779b514c-9c5c-4a25-975f-e6520630bd7d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.368952] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205049, 'name': CreateVM_Task} progress is 15%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1990.373556] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2c73e46-4be3-4e85-a72c-9d274c0d04f0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.386544] env[60764]: DEBUG nova.compute.provider_tree [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1990.395713] env[60764]: DEBUG nova.scheduler.client.report [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1990.412637] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1990.413104] env[60764]: DEBUG nova.compute.manager [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1990.445933] env[60764]: DEBUG nova.compute.utils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1990.449039] env[60764]: DEBUG nova.compute.manager [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Not allocating networking since 'none' was specified. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1990.460638] env[60764]: DEBUG nova.compute.manager [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Start building block device mappings for instance. 
{{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1990.525594] env[60764]: DEBUG nova.compute.manager [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Start spawning the instance on the hypervisor. {{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1990.547827] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1990.548080] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1990.548247] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1990.548425] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1990.548569] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1990.548760] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1990.548922] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1990.549098] env[60764]: DEBUG nova.virt.hardware [None 
req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1990.549269] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1990.549428] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1990.549596] env[60764]: DEBUG nova.virt.hardware [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1990.550465] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a53625d0-fba0-4bde-895a-345951b7371e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.558012] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1563369-6e02-4506-8606-a133151e724f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.571795] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Instance VIF info [] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1990.577476] env[60764]: DEBUG oslo.service.loopingcall [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1990.577722] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1990.577931] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9dbecc9c-3f45-4b2e-b541-79afc65e8eda {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.593015] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1990.593015] env[60764]: value = "task-2205050" [ 1990.593015] env[60764]: _type = "Task" [ 1990.593015] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1990.601173] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205050, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1990.845188] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205049, 'name': CreateVM_Task, 'duration_secs': 0.25605} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1990.845556] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1990.845932] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1990.846163] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1990.846447] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1990.846709] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55227487-b827-48e4-a790-e9d76e497a00 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1990.851065] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for the task: (returnval){ [ 1990.851065] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ab688d-28fa-abdd-9084-764efa5f64b9" [ 1990.851065] env[60764]: _type = "Task" [ 1990.851065] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1990.858287] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ab688d-28fa-abdd-9084-764efa5f64b9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1991.102798] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205050, 'name': CreateVM_Task, 'duration_secs': 0.265928} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1991.102964] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1991.103422] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1991.361168] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1991.361367] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1991.361583] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1991.361794] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1991.362112] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1991.362367] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3b7d9cd5-6dd6-4cfc-bd04-cd0bcaef11ff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1991.366598] env[60764]: DEBUG oslo_vmware.api [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for the task: (returnval){ [ 1991.366598] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52f066a3-eb77-06ef-7b48-2c0cdf81485c" [ 1991.366598] env[60764]: _type = "Task" [ 1991.366598] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1991.374350] env[60764]: DEBUG oslo_vmware.api [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52f066a3-eb77-06ef-7b48-2c0cdf81485c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1991.877820] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1991.877820] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1991.877820] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2007.330989] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2007.331325] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2007.341799] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] There are 0 instances to clean {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2008.826011] env[60764]: WARNING oslo_vmware.rw_handles [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles version, 
status, reason = self._read_status() [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2008.826011] env[60764]: ERROR oslo_vmware.rw_handles [ 2008.826665] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2008.828448] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2008.828680] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Copying Virtual Disk [datastore2] vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/391befe0-56a1-488a-88ee-3622b135298f/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2008.828957] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d87a2a3d-596a-43be-b788-df3fd00221bf {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2008.836316] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 2008.836316] env[60764]: value = "task-2205051" [ 2008.836316] env[60764]: _type = "Task" [ 2008.836316] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2008.844414] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2205051, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2009.341013] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2009.341197] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2009.341319] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2009.348323] env[60764]: DEBUG oslo_vmware.exceptions [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2009.348600] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2009.349167] env[60764]: ERROR nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2009.349167] env[60764]: Faults: ['InvalidArgument'] [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Traceback (most recent call last): [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] yield resources [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self.driver.spawn(context, instance, image_meta, [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self._fetch_image_if_missing(context, vi) [ 
2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] image_cache(vi, tmp_image_ds_loc) [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] vm_util.copy_virtual_disk( [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] session._wait_for_task(vmdk_copy_task) [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] return self.wait_for_task(task_ref) [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] return evt.wait() [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] result = hub.switch() [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] return self.greenlet.switch() [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self.f(*self.args, **self.kw) [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] raise exceptions.translate_fault(task_info.error) [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Faults: ['InvalidArgument'] [ 2009.349167] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] [ 2009.349986] env[60764]: INFO nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 
tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Terminating instance [ 2009.350857] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2009.351078] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2009.351317] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-deb74802-dafc-4615-b1d9-a03c9188b26c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.353473] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2009.353673] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2009.354419] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e93a9bb2-e06a-4182-be3b-980553793730 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.361084] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2009.361992] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-41243107-6a05-49c9-8cd2-28163f18d8be {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.365037] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365190] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365320] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365444] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365564] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365721] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365848] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.365955] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.366084] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.366200] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2009.366312] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2009.367663] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2009.367831] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2009.368837] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-693b356a-d37a-4309-acca-ebcf537306f1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.373790] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Waiting for the task: (returnval){ [ 2009.373790] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524cec3d-1205-456f-4be1-2e8f3690b33d" [ 2009.373790] env[60764]: _type = "Task" [ 2009.373790] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2009.380933] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524cec3d-1205-456f-4be1-2e8f3690b33d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2009.433541] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2009.433792] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2009.433973] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleting the datastore file [datastore2] e96a6b8e-75b7-4a2f-a838-107603ad8b80 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2009.434629] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cba6449b-4f5b-4548-bcf2-c94b6ab7d6ea {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.441214] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 2009.441214] env[60764]: value = "task-2205053" [ 2009.441214] env[60764]: _type = "Task" [ 2009.441214] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2009.448518] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2205053, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2009.884954] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2009.885326] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Creating directory with path [datastore2] vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2009.885397] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba02c98e-47ea-472c-a337-9f30a9823dd1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.895870] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Created directory with path [datastore2] vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2009.896067] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Fetch image to [datastore2] vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2009.896240] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2009.896955] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f665cfcf-0ef8-4e30-ac66-95fcb01fa606 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.903501] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb6df96a-1725-4fa6-99d9-a598656f79a8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.912522] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45cc58f5-36ff-4ec6-81bd-111c22f9c658 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.946917] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-64083056-1c36-4941-838c-e67a5c6fd367 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.955056] env[60764]: DEBUG oslo_vmware.api [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2205053, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063413} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2009.955534] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2009.955722] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2009.955888] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2009.956072] env[60764]: INFO nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Took 0.60 seconds to destroy the instance on the hypervisor. 
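The failure path recorded around this point follows a fixed order: the spawn raises a VMware fault (InvalidArgument: fileType during CopyVirtualDisk_Task), the instance is destroyed on the hypervisor (UnregisterVM, then deleting the datastore contents), and the resource claim is aborted under the "compute_resources" lock so the claimed VCPU/MEMORY_MB/DISK_GB is returned to the provider inventory. A rough sketch of that pattern is below; MiniComputeManager, SpawnFailure, and the driver/resource_tracker/claim objects are illustrative stand-ins, not Nova's real classes.

import threading

class SpawnFailure(Exception):
    """Stand-in for the driver fault seen above (InvalidArgument: fileType)."""

class MiniComputeManager:
    """Toy illustration of the build-failure cleanup order; not Nova's ComputeManager."""

    def __init__(self, driver, resource_tracker):
        self.driver = driver
        self.resource_tracker = resource_tracker
        self._compute_resources = threading.Lock()   # analogue of the "compute_resources" lock

    def build_and_run(self, instance):
        claim = self.resource_tracker.instance_claim(instance)
        try:
            self.driver.spawn(instance)               # may raise, as in the traceback above
        except SpawnFailure:
            # Mirror the log's order: destroy on the hypervisor first
            # (unregister the VM, delete its datastore contents) ...
            self.driver.destroy(instance)
            # ... then roll back the claim under the lock so the reserved
            # resources are released before the error is re-raised.
            with self._compute_resources:
                claim.abort()
            raise

This ordering matters: the hypervisor cleanup happens before the claim abort, which is why the log shows "Took 0.60 seconds to destroy the instance" before "Aborting claim" and the subsequent inventory refresh against provider 67a94047-1c18-43e8-9b47-05a1d30bcca4.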
[ 2009.957566] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-934b312a-ec7b-4e7f-b842-1ff38ac39ec3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2009.959570] env[60764]: DEBUG nova.compute.claims [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2009.959745] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2009.959956] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2009.981688] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2010.076369] env[60764]: DEBUG nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Refreshing inventories for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2010.091314] env[60764]: DEBUG nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Updating ProviderTree inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2010.091583] env[60764]: DEBUG nova.compute.provider_tree [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2010.102500] env[60764]: DEBUG nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Refreshing aggregate associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, aggregates: None {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2010.122076] env[60764]: DEBUG nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Refreshing trait associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2010.129455] env[60764]: DEBUG oslo_vmware.rw_handles [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2010.188909] env[60764]: DEBUG oslo_vmware.rw_handles [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2010.189101] env[60764]: DEBUG oslo_vmware.rw_handles [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2010.282826] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ccd7869-bc04-43f1-88e4-8afa288756f7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.290535] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c6b0e06-3f91-4583-97e6-027594362fd5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.320098] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1363086c-906c-4d90-b181-b35034db3ab9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.326696] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-590a94a5-1718-4ff0-8b86-e069c2c645e8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.330455] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2010.339647] env[60764]: DEBUG nova.compute.provider_tree [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2010.348204] env[60764]: DEBUG nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2010.361954] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.402s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.362481] env[60764]: ERROR nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2010.362481] env[60764]: Faults: ['InvalidArgument'] [ 2010.362481] env[60764]: ERROR 
nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Traceback (most recent call last): [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self.driver.spawn(context, instance, image_meta, [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self._fetch_image_if_missing(context, vi) [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] image_cache(vi, tmp_image_ds_loc) [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] vm_util.copy_virtual_disk( [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] session._wait_for_task(vmdk_copy_task) [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] return self.wait_for_task(task_ref) [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] return evt.wait() [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] result = hub.switch() [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] return self.greenlet.switch() [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] self.f(*self.args, **self.kw) [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] raise exceptions.translate_fault(task_info.error) [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Faults: ['InvalidArgument'] [ 2010.362481] env[60764]: ERROR nova.compute.manager [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] [ 2010.363352] env[60764]: DEBUG nova.compute.utils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2010.364535] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Build of instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 was re-scheduled: A specified parameter was not correct: fileType [ 2010.364535] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2010.364904] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2010.365088] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2010.365259] env[60764]: DEBUG nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2010.365417] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2010.717817] env[60764]: DEBUG nova.network.neutron [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2010.731335] env[60764]: INFO nova.compute.manager [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Took 0.37 seconds to deallocate network for instance. [ 2010.825148] env[60764]: INFO nova.scheduler.client.report [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleted allocations for instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 [ 2010.846023] env[60764]: DEBUG oslo_concurrency.lockutils [None req-888964ce-0173-406f-abdb-24aedde36ed0 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 654.126s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.846023] env[60764]: DEBUG oslo_concurrency.lockutils [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 457.483s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.846205] env[60764]: DEBUG oslo_concurrency.lockutils [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2010.846349] env[60764]: DEBUG oslo_concurrency.lockutils [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.846524] env[60764]: DEBUG oslo_concurrency.lockutils [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.849130] env[60764]: INFO nova.compute.manager [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Terminating instance [ 2010.851434] env[60764]: DEBUG nova.compute.manager [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2010.851775] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2010.852205] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-649f8806-435b-422a-a2b0-31171fc1188d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.861626] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbe62842-3b23-4ffd-898b-c33faa5d712e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.889360] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e96a6b8e-75b7-4a2f-a838-107603ad8b80 could not be found. [ 2010.889614] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2010.889731] env[60764]: INFO nova.compute.manager [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2010.889975] env[60764]: DEBUG oslo.service.loopingcall [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2010.890214] env[60764]: DEBUG nova.compute.manager [-] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2010.890313] env[60764]: DEBUG nova.network.neutron [-] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2010.911110] env[60764]: DEBUG nova.network.neutron [-] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2010.918740] env[60764]: INFO nova.compute.manager [-] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] Took 0.03 seconds to deallocate network for instance. [ 2011.000986] env[60764]: DEBUG oslo_concurrency.lockutils [None req-710b6699-635f-48e1-b5b5-191c8a34dba1 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.155s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2011.001768] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 307.734s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2011.001954] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: e96a6b8e-75b7-4a2f-a838-107603ad8b80] During sync_power_state the instance has a pending task (deleting). Skip. 
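Right after the terminate lock is released, the periodic _sync_power_states task acquires the same per-instance lock but skips the instance because it still carries a pending task ("deleting"). The snippet below sketches that guard in isolation, assuming a dict-based instance and string power-state names purely for illustration; it is not Nova's actual object model or method signature.

```python
import logging

LOG = logging.getLogger(__name__)

# Illustrative power-state names; Nova's real constants live elsewhere and
# are integers, not strings.
RUNNING, SHUTDOWN, NOSTATE = "running", "shutdown", "nostate"


def sync_power_state(instance, driver_power_state):
    """Reconcile one instance's recorded power state with the driver's view.

    `instance` is a plain dict standing in for a Nova Instance object, with
    'uuid', 'task_state' and 'power_state' keys.
    """
    if instance["task_state"] is not None:
        # Matches the log: an in-flight task (here "deleting") means the state
        # is about to change anyway, so syncing now would only race with it.
        LOG.info("[instance: %s] During sync_power_state the instance has a "
                 "pending task (%s). Skip.",
                 instance["uuid"], instance["task_state"])
        return instance["power_state"]

    if instance["power_state"] != driver_power_state:
        LOG.info("[instance: %s] Updating power state %s -> %s",
                 instance["uuid"], instance["power_state"], driver_power_state)
        instance["power_state"] = driver_power_state
    return instance["power_state"]


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    deleting = {"uuid": "e96a6b8e-75b7-4a2f-a838-107603ad8b80",
                "task_state": "deleting", "power_state": RUNNING}
    sync_power_state(deleting, NOSTATE)   # skipped: pending task
    idle = {"uuid": "2ea05216-40c5-4482-a1d8-278f7ea3d28b",
            "task_state": None, "power_state": RUNNING}
    sync_power_state(idle, SHUTDOWN)      # reconciled to the driver's view
```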
[ 2011.002142] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "e96a6b8e-75b7-4a2f-a838-107603ad8b80" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2012.330205] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2012.340918] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2012.341172] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2012.341343] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2012.341500] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2012.343019] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b68f782-fa88-43bc-8f67-fb42a6a185f1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.351406] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4622fc76-1d52-4dfe-9e62-26a32cb30582 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.364918] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09520a63-9419-44be-9c4a-28ad81e45d20 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.370938] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24332917-8725-4b49-8d8a-364e2a9ecb6f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.400428] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181275MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2012.400594] env[60764]: 
DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2012.400748] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2012.467195] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.467356] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.467480] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.467601] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.467718] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.467834] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.467949] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.468077] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.468197] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2012.468374] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2012.468505] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2012.574625] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82626ef5-c39b-4633-9a0b-b3ef77964df6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.582344] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51984e3a-468c-47e5-afbb-636abc225dc8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.611067] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37eb9a9c-16a2-4c6a-bb0c-6d189d875d38 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.617498] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aebbf28a-afda-4db7-be43-e2f3862aab1a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.631023] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2012.639104] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2012.657892] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2012.658159] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2013.658345] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2013.658713] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2014.329497] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2014.329710] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances with incomplete migration {{(pid=60764) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2015.330642] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2017.338120] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2017.338466] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2018.330627] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2019.325760] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2022.330617] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2058.845715] env[60764]: WARNING oslo_vmware.rw_handles [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2058.845715] env[60764]: ERROR oslo_vmware.rw_handles [ 2058.846410] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2058.848450] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2058.848707] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 
tempest-ServerActionsTestOtherB-1597704808-project-member] Copying Virtual Disk [datastore2] vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/d4a60293-b736-478c-8765-3a436a21ed05/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2058.849017] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aee903f0-5fed-4f10-ad36-b2c3c5146b4e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2058.858446] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Waiting for the task: (returnval){ [ 2058.858446] env[60764]: value = "task-2205054" [ 2058.858446] env[60764]: _type = "Task" [ 2058.858446] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2058.866236] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Task: {'id': task-2205054, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2059.368118] env[60764]: DEBUG oslo_vmware.exceptions [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2059.368432] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2059.369018] env[60764]: ERROR nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2059.369018] env[60764]: Faults: ['InvalidArgument'] [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Traceback (most recent call last): [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] yield resources [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self.driver.spawn(context, instance, image_meta, [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self._fetch_image_if_missing(context, vi) [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] image_cache(vi, tmp_image_ds_loc) [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] vm_util.copy_virtual_disk( [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] session._wait_for_task(vmdk_copy_task) [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] return self.wait_for_task(task_ref) [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] return evt.wait() [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] result = hub.switch() [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] return self.greenlet.switch() [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self.f(*self.args, **self.kw) [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] raise exceptions.translate_fault(task_info.error) [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Faults: ['InvalidArgument'] [ 2059.369018] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] [ 2059.369879] env[60764]: INFO nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Terminating instance [ 2059.370855] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2059.371078] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2059.371320] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2026979e-54f2-44a1-a53a-f544867e01bd {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.373528] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2059.373942] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2059.374441] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ebaa184-99ef-4dd0-bec2-23ce8740da8d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.381323] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2059.381540] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-041f6c7e-9a0a-46ce-891d-adf294f5bcc3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.387070] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2059.387258] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2059.387938] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a7993d1f-6f56-4745-b97f-4a57ffec974e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.392391] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2059.392391] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52d258f3-e700-a803-de6e-242cfdd45c56" [ 2059.392391] env[60764]: _type = "Task" [ 2059.392391] env[60764]: } to complete. 
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2059.798148] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2059.798372] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2059.798555] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Deleting the datastore file [datastore2] 2ea05216-40c5-4482-a1d8-278f7ea3d28b {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2059.798821] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6f7e4ec2-63e2-4d0c-a5fd-94bf2f94313c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.805143] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Waiting for the task: (returnval){ [ 2059.805143] env[60764]: value = "task-2205056" [ 2059.805143] env[60764]: _type = "Task" [ 2059.805143] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2059.813796] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Task: {'id': task-2205056, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2059.902024] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2059.902377] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating directory with path [datastore2] vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2059.902546] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3c884554-8b05-40fb-8d44-f14c40a7a7c1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.913254] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created directory with path [datastore2] vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2059.913448] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Fetch image to [datastore2] vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2059.913617] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2059.914363] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d7ed413-4c8f-4cd7-8ce3-cd7487b88785 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.920824] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-238d7aa5-ac87-4892-b0fb-aaf9521ca000 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.929592] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8553da7-d03c-49b8-959b-b4e89bda3c65 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.960045] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eefc496c-eb99-41f3-8ed7-4118515a543e {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.965546] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6f698e3b-b726-43e8-8b1e-89d88b3481ff {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2059.987879] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2060.036754] env[60764]: DEBUG oslo_vmware.rw_handles [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2060.095641] env[60764]: DEBUG oslo_vmware.rw_handles [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2060.095823] env[60764]: DEBUG oslo_vmware.rw_handles [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2060.315653] env[60764]: DEBUG oslo_vmware.api [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Task: {'id': task-2205056, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082267} completed successfully. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2060.315900] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2060.316047] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2060.316237] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2060.316398] env[60764]: INFO nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Took 0.94 seconds to destroy the instance on the hypervisor. [ 2060.318404] env[60764]: DEBUG nova.compute.claims [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2060.318571] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2060.318794] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2060.476343] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c978485a-0ec6-40a6-a1f9-3e0d5c5a2091 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2060.483735] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f9cc2b7-423e-4163-89f6-45daa7d89448 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2060.516149] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d742e56-9fbc-4d10-bd10-69edc9ba9649 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2060.523517] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84ecf9fd-7602-46d2-8761-4559432ecd59 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2060.536399] env[60764]: DEBUG nova.compute.provider_tree [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2060.545349] env[60764]: DEBUG nova.scheduler.client.report [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2060.561059] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.242s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.561617] env[60764]: ERROR nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2060.561617] env[60764]: Faults: ['InvalidArgument'] [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Traceback (most recent call last): [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self.driver.spawn(context, instance, image_meta, [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self._fetch_image_if_missing(context, vi) [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing 
[ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] image_cache(vi, tmp_image_ds_loc) [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] vm_util.copy_virtual_disk( [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] session._wait_for_task(vmdk_copy_task) [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] return self.wait_for_task(task_ref) [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] return evt.wait() [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] result = hub.switch() [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] return self.greenlet.switch() [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] self.f(*self.args, **self.kw) [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] raise exceptions.translate_fault(task_info.error) [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Faults: ['InvalidArgument'] [ 2060.561617] env[60764]: ERROR nova.compute.manager [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] [ 2060.562519] env[60764]: DEBUG nova.compute.utils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] VimFaultException {{(pid=60764) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 2060.563698] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Build of instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b was re-scheduled: A specified parameter was not correct: fileType [ 2060.563698] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2060.564080] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2060.564258] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2060.564427] env[60764]: DEBUG nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2060.564587] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2060.871827] env[60764]: DEBUG nova.network.neutron [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2060.882886] env[60764]: INFO nova.compute.manager [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Took 0.32 seconds to deallocate network for instance. 
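[editor's note] The re-schedule above is driven by oslo.vmware's VimFaultException, whose fault_list carries the vCenter fault name ('InvalidArgument'). A minimal sketch of inspecting that exception follows; spawn_instance() is a hypothetical stand-in for the driver call that failed here, and only the exception type and its fault_list attribute are taken from the log.

    # Sketch of handling the fault raised above. spawn_instance() is a
    # hypothetical callable; it is not Nova's API.
    from oslo_vmware import exceptions as vexc

    def build_once(spawn_instance):
        try:
            spawn_instance()
        except vexc.VimFaultException as exc:
            # exc.fault_list is a list of vSphere fault names, e.g. ['InvalidArgument'].
            if 'InvalidArgument' in (exc.fault_list or []):
                # Treat it as a retryable build failure, mirroring the
                # re-schedule recorded in the log, rather than a fatal error.
                return 'reschedule'
            raise
        return 'ok'
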
[ 2060.970720] env[60764]: INFO nova.scheduler.client.report [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Deleted allocations for instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b [ 2060.995830] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2610ed8c-b798-486b-8aa2-e31d8b9157dc tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 675.995s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.996119] env[60764]: DEBUG oslo_concurrency.lockutils [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 480.167s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2060.996346] env[60764]: DEBUG oslo_concurrency.lockutils [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Acquiring lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2060.996565] env[60764]: DEBUG oslo_concurrency.lockutils [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2060.996734] env[60764]: DEBUG oslo_concurrency.lockutils [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2060.998710] env[60764]: INFO nova.compute.manager [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Terminating instance [ 2061.000408] env[60764]: DEBUG nova.compute.manager [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2061.000605] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2061.001084] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-82306c73-9a19-4bbd-b050-7ec7dc7939c4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2061.010013] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4764e877-de42-480b-903a-33652c4ce477 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2061.037089] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2ea05216-40c5-4482-a1d8-278f7ea3d28b could not be found. [ 2061.037312] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2061.037490] env[60764]: INFO nova.compute.manager [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2061.037725] env[60764]: DEBUG oslo.service.loopingcall [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2061.037946] env[60764]: DEBUG nova.compute.manager [-] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2061.038045] env[60764]: DEBUG nova.network.neutron [-] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2061.060794] env[60764]: DEBUG nova.network.neutron [-] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2061.068847] env[60764]: INFO nova.compute.manager [-] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] Took 0.03 seconds to deallocate network for instance. 
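[editor's note] The "Acquiring lock ... / acquired ... waited Ns / released ... held Ns" bookkeeping around this point comes from oslo.concurrency's lockutils. A minimal sketch of the two usual usage forms follows; the lock names echo the log, while the function bodies are placeholders.

    # Sketch of the oslo.concurrency locking pattern behind the lock lines
    # above and below. Bodies are placeholders.
    from oslo_concurrency import lockutils

    # Decorator form: serializes every call on the named lock.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        pass  # resource-tracker bookkeeping would go here

    # Context-manager form: same semantics, scoped to a block.
    def terminate(instance_uuid):
        with lockutils.lock(instance_uuid):
            pass  # do_terminate_instance work would go here
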
[ 2061.154181] env[60764]: DEBUG oslo_concurrency.lockutils [None req-29631b00-0781-404e-8993-5f1a167b65e9 tempest-ServerActionsTestOtherB-1597704808 tempest-ServerActionsTestOtherB-1597704808-project-member] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.158s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2061.155390] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 357.887s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2061.155592] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 2ea05216-40c5-4482-a1d8-278f7ea3d28b] During sync_power_state the instance has a pending task (deleting). Skip. [ 2061.155764] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "2ea05216-40c5-4482-a1d8-278f7ea3d28b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2069.330895] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2069.331274] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2069.331274] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2069.350177] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.350333] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.350464] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.350590] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.350711] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.350831] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.350950] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.351221] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2069.351368] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2070.589461] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "16597080-42c7-40df-9893-38751d9ac11a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2071.330499] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2073.329730] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2074.330502] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2074.341963] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2074.342240] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2074.342440] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2074.342630] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2074.343811] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8d7a4de-b40d-480f-8913-8ec9a0486f34 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.352602] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53deb293-acf0-4caf-9448-5489d29c1e38 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.366900] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-260be280-df3e-4fd6-b6d7-bef334dca991 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.373105] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-554172d5-97dd-4d50-866f-43b9de4a4cf5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.402366] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181195MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2074.402538] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2074.402762] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2074.464696] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance a6272c75-92de-45a0-8e3e-82e342f0475c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.464849] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.464976] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.465116] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.465237] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.465358] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.465474] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.465589] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2074.465766] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2074.465900] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2074.554840] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98d31723-c0b1-4792-8662-19889ac5070e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.562214] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c6a9304-5300-4f5c-968b-9f70429609f0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.591164] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-176eb669-cae7-49ea-82eb-485605ad8a39 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.597862] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51758eb2-8857-4dec-8db0-24d7e0c11d7c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.610322] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2074.620145] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2074.633246] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2074.633428] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2075.633078] env[60764]: DEBUG oslo_service.periodic_task [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2076.325934] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2078.331137] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2078.331487] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2079.330955] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2080.325257] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2082.331598] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.366033] env[60764]: WARNING oslo_vmware.rw_handles [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2105.366033] env[60764]: ERROR oslo_vmware.rw_handles [ 2105.366716] env[60764]: DEBUG nova.virt.vmwareapi.images [None 
req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2105.368622] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2105.368880] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Copying Virtual Disk [datastore2] vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/b0da3c7f-f379-402d-a1da-6c53bee81f9b/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2105.369191] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8ea48412-0d41-4cd0-acf7-26dc6cd9777d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.378080] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2105.378080] env[60764]: value = "task-2205057" [ 2105.378080] env[60764]: _type = "Task" [ 2105.378080] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2105.385905] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205057, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2105.888562] env[60764]: DEBUG oslo_vmware.exceptions [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Fault InvalidArgument not matched. 
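[editor's note] The CopyVirtualDisk_Task above is awaited by periodically reading the task state; when the task errors, the fault is translated and, since 'InvalidArgument' matches no dedicated exception class, the generic VimFaultException is raised. Below is an illustrative polling loop in the same spirit; get_task_info() is a hypothetical accessor, not the oslo.vmware API.

    # Illustrative loop in the spirit of oslo.vmware's task polling.
    # get_task_info() is a hypothetical stand-in for a PropertyCollector
    # read of the Task object.
    import time

    def wait_for_task(get_task_info, poll_interval=0.5):
        while True:
            info = get_task_info()   # expected keys: state, progress, error
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # oslo.vmware would translate the fault here and raise
                # VimFaultException for unmatched faults such as InvalidArgument.
                raise RuntimeError(info['error'])
            time.sleep(poll_interval)  # 'queued'/'running': keep polling
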
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2105.888840] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2105.889398] env[60764]: ERROR nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2105.889398] env[60764]: Faults: ['InvalidArgument'] [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Traceback (most recent call last): [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] yield resources [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self.driver.spawn(context, instance, image_meta, [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self._fetch_image_if_missing(context, vi) [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] image_cache(vi, tmp_image_ds_loc) [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] vm_util.copy_virtual_disk( [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] session._wait_for_task(vmdk_copy_task) [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] return self.wait_for_task(task_ref) [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] return evt.wait() [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] result = hub.switch() [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] return self.greenlet.switch() [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self.f(*self.args, **self.kw) [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] raise exceptions.translate_fault(task_info.error) [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Faults: ['InvalidArgument'] [ 2105.889398] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] [ 2105.890431] env[60764]: INFO nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Terminating instance [ 2105.891242] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2105.891445] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2105.891678] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ea1b0664-9b2e-49ab-b0a9-6b770a70193a {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.893940] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2105.894148] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2105.894843] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-832318be-0e60-4e8d-8992-6ef805c67531 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.901154] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2105.901352] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-628d0b67-970e-4c74-9bb5-8a28ea2526fa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.903433] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2105.903602] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2105.904563] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-525b75f6-ee9d-4eb7-949e-d350f7b5c4ce {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.909503] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 2105.909503] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]525a30a0-0ba9-f0fa-09f2-6b44c3802a7a" [ 2105.909503] env[60764]: _type = "Task" [ 2105.909503] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2105.920522] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]525a30a0-0ba9-f0fa-09f2-6b44c3802a7a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2106.175745] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2106.175978] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2106.176181] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleting the datastore file [datastore2] a6272c75-92de-45a0-8e3e-82e342f0475c {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2106.176519] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-84fb02f7-fc3c-4928-b6b6-926085823de5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.182937] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2106.182937] env[60764]: value = "task-2205059" [ 2106.182937] env[60764]: _type = "Task" [ 2106.182937] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2106.190274] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205059, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2106.419751] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2106.420044] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating directory with path [datastore2] vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2106.420247] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8eb7b880-9337-49e9-99e9-5e8b45525b1c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.431300] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Created directory with path [datastore2] vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2106.431480] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Fetch image to [datastore2] vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2106.431648] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2106.432367] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e07ce17-8233-4aa4-8982-a1a18c6265fb {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.438944] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ca7dea3-0ac0-44d5-998b-759fb73e5469 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.448209] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed6cb838-7446-49a0-b74a-f3b6bdd2e51c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.480600] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-010e2e0b-306a-44e4-b3b1-8e9c8629218f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.487212] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-107ea5ea-8bfd-45e5-b69d-e6af089e3c1b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.511545] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2106.643188] env[60764]: DEBUG oslo_vmware.rw_handles [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2106.702707] env[60764]: DEBUG oslo_vmware.rw_handles [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2106.702927] env[60764]: DEBUG oslo_vmware.rw_handles [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2106.707206] env[60764]: DEBUG oslo_vmware.api [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205059, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082383} completed successfully. 
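[editor's note] The warning at 2105.366 shows the host closing the upload connection without sending a response; since the transfer had already completed, the close path only logs it. A minimal sketch of tolerating that condition with the standard library follows; the connection object is a placeholder for an already-used HTTPS connection.

    # Sketch of tolerating the RemoteDisconnected seen at 2105.366: the upload
    # has finished, so a missing response on close is logged, not fatal.
    import http.client
    import logging

    LOG = logging.getLogger(__name__)

    def close_write_handle(conn: http.client.HTTPSConnection):
        try:
            conn.getresponse()  # drain the server's reply, if any
        except http.client.RemoteDisconnected:
            LOG.warning("Remote end closed connection without response; "
                        "upload already completed, ignoring.")
        finally:
            conn.close()
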
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2106.707495] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2106.707713] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2106.707914] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2106.708112] env[60764]: INFO nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Took 0.81 seconds to destroy the instance on the hypervisor. [ 2106.710543] env[60764]: DEBUG nova.compute.claims [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2106.710638] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2106.710881] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2106.842333] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85f86586-b274-4ac9-84a5-f701acc96b5c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.849960] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fabc75cf-729d-44f5-bfc1-87efebffe31e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.879327] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeb2707b-0f01-4890-8320-2c9526fc58a7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.886267] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d53fe9e-773f-46bf-b0f5-921736fe5233 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.899037] env[60764]: DEBUG nova.compute.provider_tree [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2106.907047] env[60764]: DEBUG nova.scheduler.client.report [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2106.919991] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.209s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2106.920559] env[60764]: ERROR nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2106.920559] env[60764]: Faults: ['InvalidArgument'] [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Traceback (most recent call last): [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self.driver.spawn(context, instance, image_meta, [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self._fetch_image_if_missing(context, vi) [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] image_cache(vi, tmp_image_ds_loc) [ 2106.920559] env[60764]: ERROR nova.compute.manager 
[instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] vm_util.copy_virtual_disk( [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] session._wait_for_task(vmdk_copy_task) [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] return self.wait_for_task(task_ref) [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] return evt.wait() [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] result = hub.switch() [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] return self.greenlet.switch() [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] self.f(*self.args, **self.kw) [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] raise exceptions.translate_fault(task_info.error) [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Faults: ['InvalidArgument'] [ 2106.920559] env[60764]: ERROR nova.compute.manager [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] [ 2106.921368] env[60764]: DEBUG nova.compute.utils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2106.922652] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 
tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Build of instance a6272c75-92de-45a0-8e3e-82e342f0475c was re-scheduled: A specified parameter was not correct: fileType [ 2106.922652] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2106.923077] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2106.923258] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2106.923428] env[60764]: DEBUG nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2106.923595] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2107.235121] env[60764]: DEBUG nova.network.neutron [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2107.247024] env[60764]: INFO nova.compute.manager [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Took 0.32 seconds to deallocate network for instance. 
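[editor's note, illustrative sketch] The build failure traced above is the vCenter CopyVirtualDisk_Task being polled until it reports an error, which oslo.vmware surfaces as a VimFaultException ("A specified parameter was not correct: fileType", fault InvalidArgument); the claim is then aborted and the instance re-scheduled. A minimal sketch of that poll-and-translate pattern, with made-up names (get_task_info, poll_interval) rather than Nova's or oslo.vmware's actual API:

    # Sketch only: the shape of the polling loop seen in the traceback above,
    # not the oslo.vmware implementation.
    import time


    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list


    def wait_for_task(get_task_info, poll_interval=0.5):
        """Poll a vCenter task until success; raise on error.

        get_task_info is assumed to return a dict such as
        {'state': 'error', 'faults': ['InvalidArgument'],
         'message': 'A specified parameter was not correct: fileType'}.
        """
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # Mirrors _poll_task raising exceptions.translate_fault(task_info.error)
                raise VimFaultException(info.get('faults', []), info.get('message', ''))
            time.sleep(poll_interval)
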
[ 2107.338964] env[60764]: INFO nova.scheduler.client.report [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleted allocations for instance a6272c75-92de-45a0-8e3e-82e342f0475c [ 2107.360130] env[60764]: DEBUG oslo_concurrency.lockutils [None req-c6a84c8e-5eaf-47b2-a33d-b0845aef13fe tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.375s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2107.360398] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 429.946s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2107.360630] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "a6272c75-92de-45a0-8e3e-82e342f0475c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2107.360831] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2107.360992] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2107.363429] env[60764]: INFO nova.compute.manager [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Terminating instance [ 2107.365439] env[60764]: DEBUG nova.compute.manager [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2107.365632] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2107.366121] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6447bc7a-526c-4eb7-916d-011d6eaced2c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.375444] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe4cf084-2046-4b4a-b690-e8c6250bde7f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2107.401743] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a6272c75-92de-45a0-8e3e-82e342f0475c could not be found. [ 2107.401938] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2107.402138] env[60764]: INFO nova.compute.manager [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2107.402372] env[60764]: DEBUG oslo.service.loopingcall [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2107.402829] env[60764]: DEBUG nova.compute.manager [-] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2107.402933] env[60764]: DEBUG nova.network.neutron [-] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2107.423375] env[60764]: DEBUG nova.network.neutron [-] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2107.431179] env[60764]: INFO nova.compute.manager [-] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] Took 0.03 seconds to deallocate network for instance. 
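[editor's note, illustrative sketch] The repeated lock lines around this terminate path ("Acquiring lock ... acquired ... waited Ns ... released ... held Ns") are oslo_concurrency.lockutils accounting for how long a caller waited for, and then held, a named in-process lock such as "compute_resources". A rough stdlib-only sketch of that wait/hold bookkeeping; timed_lock and its message format are invented here for illustration and are not the oslo implementation:

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}


    @contextmanager
    def timed_lock(name, owner):
        # One process-wide lock object per name.
        lock = _locks.setdefault(name, threading.Lock())
        start = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - start
        print('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, owner, waited))
        held_start = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            held = time.monotonic() - held_start
            print('Lock "%s" "released" by "%s" :: held %.3fs' % (name, owner, held))

    # e.g. with timed_lock("compute_resources", "abort_instance_claim"): ...
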
[ 2107.513822] env[60764]: DEBUG oslo_concurrency.lockutils [None req-fd6c31aa-cced-42ac-aa4f-5d122233eb4d tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.153s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2107.515099] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 404.246s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2107.515317] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: a6272c75-92de-45a0-8e3e-82e342f0475c] During sync_power_state the instance has a pending task (deleting). Skip. [ 2107.515500] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "a6272c75-92de-45a0-8e3e-82e342f0475c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2129.330875] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2129.331278] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2129.331278] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2129.349249] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.349388] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.349518] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.349643] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.349764] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.349884] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.350010] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2129.350137] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2133.329881] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2133.330221] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.330294] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.341293] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2134.341509] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.341669] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2134.341823] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 
2134.342945] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0a800a-0b59-4322-9cad-4181ee094f28 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.353081] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bd8ca7e-b358-49d1-834a-daad23c8eb19 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.366453] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40a7e77d-75e7-420a-8793-6471ab5396d6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.372489] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b15a622-5602-4022-8b86-6653bc5a6b1b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.400600] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181268MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2134.400760] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2134.401116] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.462869] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463036] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463169] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463290] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463405] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463519] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463631] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.463804] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2134.463938] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2134.545572] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-456e2996-af0e-4289-b281-cf35536194b4 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.552822] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96ad894a-1fa5-49e7-815b-7a0d9669a585 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.583336] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b8647d0-2e24-46ee-961b-18df496a4793 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.589901] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8966f71c-d29c-4b17-967c-f59b1390b751 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.602605] env[60764]: DEBUG nova.compute.provider_tree 
[None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2134.610414] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2134.623114] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2134.623286] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2137.623676] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2138.329608] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2138.329765] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2140.325837] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2140.329425] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2142.332194] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.095050] env[60764]: WARNING oslo_vmware.rw_handles [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2156.095050] env[60764]: ERROR oslo_vmware.rw_handles [ 2156.095050] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2156.096957] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2156.097209] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 
tempest-AttachVolumeShelveTestJSON-686217160-project-member] Copying Virtual Disk [datastore2] vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/2d43cab5-c9a5-48b0-85bc-62437c9b7039/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2156.097500] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-59805b81-346e-4251-b1f2-c0f3f5fb9a3f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.105213] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 2156.105213] env[60764]: value = "task-2205060" [ 2156.105213] env[60764]: _type = "Task" [ 2156.105213] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2156.113131] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': task-2205060, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2156.616399] env[60764]: DEBUG oslo_vmware.exceptions [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2156.616631] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2156.617192] env[60764]: ERROR nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2156.617192] env[60764]: Faults: ['InvalidArgument'] [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Traceback (most recent call last): [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] yield resources [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self.driver.spawn(context, instance, image_meta, [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self._fetch_image_if_missing(context, vi) [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] image_cache(vi, tmp_image_ds_loc) [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] vm_util.copy_virtual_disk( [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] session._wait_for_task(vmdk_copy_task) [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] return self.wait_for_task(task_ref) [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] return evt.wait() [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] result = hub.switch() [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] return self.greenlet.switch() [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self.f(*self.args, **self.kw) [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] raise exceptions.translate_fault(task_info.error) [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Faults: ['InvalidArgument'] [ 2156.617192] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] [ 2156.618067] env[60764]: INFO nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Terminating instance [ 2156.619064] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2156.619287] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2156.619525] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b85d4a6b-0c9b-42ca-a705-46bc211058ad {{(pid=60764) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.621649] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2156.621839] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2156.622558] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a50c4a98-899e-4d75-a773-165bfa0c7866 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.629327] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2156.629560] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b05222d8-cad6-4391-b7a0-837952a34e22 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.631659] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2156.631824] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2156.632778] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9b2407e5-946a-42a8-81b2-2e3689dfb392 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.637402] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Waiting for the task: (returnval){ [ 2156.637402] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5201afbc-75a8-d59a-d512-3f35cc0dbe8f" [ 2156.637402] env[60764]: _type = "Task" [ 2156.637402] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2156.645356] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5201afbc-75a8-d59a-d512-3f35cc0dbe8f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2156.698123] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2156.698352] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2156.698509] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Deleting the datastore file [datastore2] 73ba3af8-9a29-4c63-9a55-c9879e74239d {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2156.698768] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9d3cd00c-9804-441e-aa10-b7af857883a2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2156.705283] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for the task: (returnval){ [ 2156.705283] env[60764]: value = "task-2205062" [ 2156.705283] env[60764]: _type = "Task" [ 2156.705283] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2156.712574] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': task-2205062, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2157.147453] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2157.147811] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Creating directory with path [datastore2] vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2157.147934] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e428e25a-320f-41b2-9818-9b5af2480f94 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.159539] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Created directory with path [datastore2] vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2157.159727] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Fetch image to [datastore2] vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2157.159880] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2157.160644] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5316e30-5488-4eaf-93b7-0e2eae5bb839 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.167401] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51f36f4d-7db8-40a5-8a21-e05ed6148b0a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.177656] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62e89081-e1fa-410e-9338-12f2de54c8af {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.211709] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65f634ab-7a1f-48a2-b870-f2b5a7b44cc9 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.218925] env[60764]: DEBUG oslo_vmware.api [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Task: {'id': task-2205062, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08026} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2157.220377] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2157.220605] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2157.220728] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2157.220942] env[60764]: INFO nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Took 0.60 seconds to destroy the instance on the hypervisor. 
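Every vCenter operation in this log follows the same shape as the DeleteDatastoreFile_Task sequence above: a method invocation returns a Task reference, and oslo.vmware then polls it (the wait_for_task / _poll_task citations in api.py) until it ends in success or error. The sketch below is a minimal, self-contained illustration of that polling loop under stated assumptions: TaskInfo, TaskFailed and the poll callable are stand-ins for illustration, not the real oslo.vmware objects.

import time
from dataclasses import dataclass

@dataclass
class TaskInfo:
    """Minimal stand-in for the vCenter TaskInfo structure that gets polled."""
    state: str              # 'queued' | 'running' | 'success' | 'error'
    progress: int = 0
    result: object = None
    error_message: str = ''

class TaskFailed(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(poll, interval=0.5):
    """Poll a task until it finishes; `poll` is any callable returning a TaskInfo."""
    while True:
        info = poll()
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            # oslo.vmware translates the server-side fault at this point, which
            # is where errors such as "A specified parameter was not correct:
            # fileType" (Faults: ['InvalidArgument']) surface to the compute manager.
            raise TaskFailed(info.error_message)
        time.sleep(interval)  # still queued/running: report progress and retry

# Simulated run: one 0% progress poll, then completion.
states = iter([TaskInfo('running', 0), TaskInfo('success', 100, result='ok')])
print(wait_for_task(lambda: next(states), interval=0.01))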
[ 2157.223051] env[60764]: DEBUG nova.compute.claims [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2157.223224] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2157.223435] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2157.225976] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f6398140-cf06-4783-a1d3-533dc21af0a5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.247758] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2157.357743] env[60764]: DEBUG oslo_vmware.rw_handles [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2157.413403] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14adbf41-d90e-47b9-87b2-b03b802ac92e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.417997] env[60764]: DEBUG oslo_vmware.rw_handles [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2157.418196] env[60764]: DEBUG oslo_vmware.rw_handles [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2157.422635] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-614cb2ab-7989-4c08-94bf-33cfbeec49d7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.455028] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e15da666-c5c8-4dc4-8921-6dfde0ceb912 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.461244] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64f63c6f-2664-4b1c-a04a-d3fcb37de28b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.474391] env[60764]: DEBUG nova.compute.provider_tree [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2157.483076] env[60764]: DEBUG nova.scheduler.client.report [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2157.497446] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.274s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2157.497954] env[60764]: ERROR nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2157.497954] env[60764]: Faults: ['InvalidArgument'] [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Traceback (most recent call last): [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self.driver.spawn(context, instance, image_meta, [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self._fetch_image_if_missing(context, vi) [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] image_cache(vi, tmp_image_ds_loc) [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] vm_util.copy_virtual_disk( [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] session._wait_for_task(vmdk_copy_task) [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] return self.wait_for_task(task_ref) [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] return evt.wait() [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] result = hub.switch() [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] return self.greenlet.switch() [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] self.f(*self.args, **self.kw) [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] raise exceptions.translate_fault(task_info.error) [ 
2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Faults: ['InvalidArgument'] [ 2157.497954] env[60764]: ERROR nova.compute.manager [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] [ 2157.498824] env[60764]: DEBUG nova.compute.utils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2157.500390] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Build of instance 73ba3af8-9a29-4c63-9a55-c9879e74239d was re-scheduled: A specified parameter was not correct: fileType [ 2157.500390] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2157.500757] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2157.500922] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2157.501104] env[60764]: DEBUG nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2157.501269] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2157.814179] env[60764]: DEBUG nova.network.neutron [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2157.826813] env[60764]: INFO nova.compute.manager [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Took 0.33 seconds to deallocate network for instance. 
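The inventory the report client logged a couple of entries above (VCPU total 48 / allocation_ratio 4.0, MEMORY_MB total 196590 / reserved 512, DISK_GB total 400) is what bounds scheduling: usable capacity per resource class is (total - reserved) * allocation_ratio, and max_unit caps any single allocation. A short worked check against the figures in this log follows; the helper is illustrative, not placement's own code.

def capacity(total, reserved, allocation_ratio):
    # Usable capacity for one resource class, as placement computes it.
    return (total - reserved) * allocation_ratio

# Values from the inventory reported above for provider
# 67a94047-1c18-43e8-9b47-05a1d30bcca4.
print(capacity(48, 0, 4.0))        # VCPU      -> 192.0 schedulable vCPUs
print(capacity(196590, 512, 1.0))  # MEMORY_MB -> 196078 MB
print(capacity(400, 0, 1.0))       # DISK_GB   -> 400 GB (max_unit 176 limits one allocation)

# The resource tracker's "Final resource view" later in this section
# (used_ram=1280MB, used_disk=6GB, used_vcpus=6) is consistent with six
# 1-vCPU / 128 MB / 1 GB instances on top of the 512 MB reservation:
print(512 + 6 * 128)               # -> 1280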
[ 2157.922894] env[60764]: INFO nova.scheduler.client.report [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Deleted allocations for instance 73ba3af8-9a29-4c63-9a55-c9879e74239d [ 2157.948863] env[60764]: DEBUG oslo_concurrency.lockutils [None req-713ae92a-64ff-4c8e-93b7-533c181c2c3d tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 613.712s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2157.949172] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 454.680s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2157.949305] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] During sync_power_state the instance has a pending task (spawning). Skip. [ 2157.949472] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2157.949984] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 418.157s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2157.950216] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Acquiring lock "73ba3af8-9a29-4c63-9a55-c9879e74239d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2157.950419] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2157.950578] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2157.952508] env[60764]: INFO nova.compute.manager [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Terminating instance [ 2157.954189] env[60764]: DEBUG nova.compute.manager [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2157.954384] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2157.954647] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ccc104b5-d169-4195-b601-fffb87f7670b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.964829] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca711c1-e32c-424e-84d9-ebf9811289f6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2157.991231] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 73ba3af8-9a29-4c63-9a55-c9879e74239d could not be found. [ 2157.991449] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2157.991630] env[60764]: INFO nova.compute.manager [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2157.991877] env[60764]: DEBUG oslo.service.loopingcall [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2157.992437] env[60764]: DEBUG nova.compute.manager [-] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2157.992537] env[60764]: DEBUG nova.network.neutron [-] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2158.014968] env[60764]: DEBUG nova.network.neutron [-] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2158.023425] env[60764]: INFO nova.compute.manager [-] [instance: 73ba3af8-9a29-4c63-9a55-c9879e74239d] Took 0.03 seconds to deallocate network for instance. [ 2158.115022] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ed3fb308-05a2-4385-9cf6-fe8d88b9ea44 tempest-AttachVolumeShelveTestJSON-686217160 tempest-AttachVolumeShelveTestJSON-686217160-project-member] Lock "73ba3af8-9a29-4c63-9a55-c9879e74239d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.165s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2186.167713] env[60764]: DEBUG oslo_concurrency.lockutils [None req-80fe3324-3b0d-41ad-9228-e2502732a5dd tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "c6fbe481-a9e0-40d5-9cac-e4645a50be1a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2191.330645] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2191.331171] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2191.331171] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2191.347790] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.348020] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.348267] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.348479] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.348686] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.348892] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2191.349109] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2194.329599] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2194.329993] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2194.342325] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2194.342536] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2194.342709] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2194.342866] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2194.344048] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-890eeed8-ea5e-4a00-87ff-184f1ecfe764 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.352878] env[60764]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a237bb14-ed8c-4b29-aa9b-4b6b44681bf6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.366648] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6131fce7-43a3-4610-93c5-52fc67aab95b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.372816] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72c5082a-2430-4560-968e-d53105323e3d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.403172] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181232MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2194.403314] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2194.403499] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2194.461199] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance f1940470-82f6-41fb-bd36-96561ad20102 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2194.461359] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2194.461484] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2194.461603] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2194.461718] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2194.461833] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2194.462015] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2194.462161] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2194.533736] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03dffd48-5751-411c-885c-10b7cef5996c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.541104] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee2ede70-629b-4dd5-b83d-fbf7cce37e43 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.569971] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43a3d3c8-877b-4a53-8856-dd36c36c477c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.576619] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c583716b-b271-4cbe-90f2-6520dff4a688 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2194.589439] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2194.597383] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2194.609886] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2194.610075] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2195.610591] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2197.325952] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2199.330570] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2199.331022] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2199.331022] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2200.325802] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2202.331067] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2202.331418] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2204.429815] env[60764]: WARNING oslo_vmware.rw_handles [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2204.429815] env[60764]: ERROR oslo_vmware.rw_handles [ 2204.430483] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2204.432262] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2204.432527] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Copying Virtual 
Disk [datastore2] vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/174f044d-4a2d-4295-ab74-cbc063b653d6/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2204.432808] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-892ef86f-3fd4-4338-a331-8a4a92e880d6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.441315] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Waiting for the task: (returnval){ [ 2204.441315] env[60764]: value = "task-2205063" [ 2204.441315] env[60764]: _type = "Task" [ 2204.441315] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2204.448946] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Task: {'id': task-2205063, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2204.951260] env[60764]: DEBUG oslo_vmware.exceptions [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2204.951550] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2204.952115] env[60764]: ERROR nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2204.952115] env[60764]: Faults: ['InvalidArgument'] [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] Traceback (most recent call last): [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] yield resources [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self.driver.spawn(context, instance, image_meta, [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 
539, in spawn [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self._fetch_image_if_missing(context, vi) [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] image_cache(vi, tmp_image_ds_loc) [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] vm_util.copy_virtual_disk( [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] session._wait_for_task(vmdk_copy_task) [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] return self.wait_for_task(task_ref) [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] return evt.wait() [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] result = hub.switch() [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] return self.greenlet.switch() [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self.f(*self.args, **self.kw) [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] raise exceptions.translate_fault(task_info.error) [ 2204.952115] env[60764]: ERROR nova.compute.manager 
[instance: f1940470-82f6-41fb-bd36-96561ad20102] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] Faults: ['InvalidArgument'] [ 2204.952115] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] [ 2204.953257] env[60764]: INFO nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Terminating instance [ 2204.953998] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2204.954221] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2204.954462] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-027e18f1-9dd3-49dd-9b19-b9221233dd4c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.956890] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2204.957090] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2204.957812] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f91330f0-3f42-4142-b200-0711ad5eba38 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.964858] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2204.965135] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9c2e22c4-3874-4231-845e-0d35a668a577 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.967453] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2204.967623] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2204.968567] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-efdcb096-6da7-4c74-9652-7ff9eea502d6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.973264] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Waiting for the task: (returnval){ [ 2204.973264] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52a998ac-04f2-abe9-5585-011dca015cdf" [ 2204.973264] env[60764]: _type = "Task" [ 2204.973264] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2204.979938] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52a998ac-04f2-abe9-5585-011dca015cdf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2205.043529] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2205.043738] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2205.043879] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Deleting the datastore file [datastore2] f1940470-82f6-41fb-bd36-96561ad20102 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2205.044173] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-745a2eb1-a11d-4e23-939b-b47eb5aa71ae {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.049884] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Waiting for the task: (returnval){ [ 2205.049884] env[60764]: value = "task-2205065" [ 2205.049884] env[60764]: _type = "Task" [ 2205.049884] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2205.057521] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Task: {'id': task-2205065, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2205.483657] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2205.484046] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Creating directory with path [datastore2] vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2205.484160] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-19fe0c33-f298-4846-b025-8f5b14268e8f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.495051] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Created directory with path [datastore2] vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2205.495255] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Fetch image to [datastore2] vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2205.495425] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2205.496186] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3870118d-f062-41ed-8f32-b83a3955c741 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.502480] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82fa4fd4-b638-4f57-824c-f5b4463ccb51 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.511132] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cfc7368-a4b9-4714-bf48-cfecf2b6d5c5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.541953] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32a4eb61-94e4-4288-bc9f-a2f91c0c2f9c {{(pid=60764) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.547452] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9348b494-c995-4f08-8756-4930e9066025 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.557523] env[60764]: DEBUG oslo_vmware.api [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Task: {'id': task-2205065, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061115} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2205.557747] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2205.557926] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2205.558122] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2205.558294] env[60764]: INFO nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Took 0.60 seconds to destroy the instance on the hypervisor. 
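The records above show the destroy path issuing FileManager.DeleteDatastoreFile_Task and then polling the returned task (the wait_for_task / _poll_task lines) until it reports completion after about 0.061 s. A minimal sketch of that call pattern through the public oslo.vmware API follows; the vCenter host, credentials and datacenter reference are placeholders, and the positional constructor arguments may be named or ordered differently across oslo.vmware releases.

    # Sketch of the DeleteDatastoreFile_Task / wait_for_task pattern seen above.
    # Host, credentials and the datacenter reference are placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.org',   # vCenter host (placeholder)
        'administrator',    # username (placeholder)
        'secret',           # password (placeholder)
        10,                 # API retry count
        0.5)                # task poll interval, seconds

    content = session.vim.service_content
    # DeleteDatastoreFile_Task returns a Task managed object reference;
    # wait_for_task() polls it (the _poll_task lines above) until it succeeds
    # or raises the translated fault.
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', content.fileManager,
        name='[datastore2] f1940470-82f6-41fb-bd36-96561ad20102',
        datacenter=None)    # a real Datacenter moref would be passed here
    session.wait_for_task(task)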
[ 2205.560364] env[60764]: DEBUG nova.compute.claims [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2205.560528] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2205.560742] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2205.566284] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2205.684379] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2643095-4ed8-4a93-bcfe-33bf26a11be1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.691783] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-417ad565-b097-4d68-ac84-2fb2f5e5fb96 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.725118] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac20102c-abdd-47b2-a82e-7d55d03972c0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.731848] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fe867da-4422-4907-a1e4-f04094b4861e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2205.744707] env[60764]: DEBUG nova.compute.provider_tree [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2205.746439] env[60764]: DEBUG oslo_vmware.rw_handles [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2205.804875] env[60764]: DEBUG nova.scheduler.client.report [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2205.809606] env[60764]: DEBUG oslo_vmware.rw_handles [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2205.809771] env[60764]: DEBUG oslo_vmware.rw_handles [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2205.818200] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.257s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2205.818723] env[60764]: ERROR nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2205.818723] env[60764]: Faults: ['InvalidArgument'] [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] Traceback (most recent call last): [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self.driver.spawn(context, instance, image_meta, [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
786, in spawn [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self._fetch_image_if_missing(context, vi) [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] image_cache(vi, tmp_image_ds_loc) [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] vm_util.copy_virtual_disk( [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] session._wait_for_task(vmdk_copy_task) [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] return self.wait_for_task(task_ref) [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] return evt.wait() [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] result = hub.switch() [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] return self.greenlet.switch() [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] self.f(*self.args, **self.kw) [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] raise exceptions.translate_fault(task_info.error) [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: f1940470-82f6-41fb-bd36-96561ad20102] Faults: ['InvalidArgument'] [ 2205.818723] env[60764]: ERROR nova.compute.manager [instance: 
f1940470-82f6-41fb-bd36-96561ad20102] [ 2205.819577] env[60764]: DEBUG nova.compute.utils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2205.820832] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Build of instance f1940470-82f6-41fb-bd36-96561ad20102 was re-scheduled: A specified parameter was not correct: fileType [ 2205.820832] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2205.821205] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2205.821375] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2205.821542] env[60764]: DEBUG nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2205.821736] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2206.154418] env[60764]: DEBUG nova.network.neutron [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2206.165344] env[60764]: INFO nova.compute.manager [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Took 0.34 seconds to deallocate network for instance. 
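The inventory repeated by the scheduler report client in this span (provider 67a94047-1c18-43e8-9b47-05a1d30bcca4) gives VCPU total 48 with allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, and DISK_GB total 400. Placement treats the schedulable capacity of each resource class as (total - reserved) * allocation_ratio; the snippet below only reproduces that arithmetic for the logged values and is not code taken from Nova.

    # Capacity implied by the inventory logged above:
    # capacity = (total - reserved) * allocation_ratio, per resource class.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

The max_unit values in the same report (16, 65530 and 176) still cap what a single allocation may request, so no one instance can consume the full 192 vCPUs of oversubscribed capacity.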
[ 2206.261264] env[60764]: INFO nova.scheduler.client.report [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Deleted allocations for instance f1940470-82f6-41fb-bd36-96561ad20102 [ 2206.281138] env[60764]: DEBUG oslo_concurrency.lockutils [None req-f0737805-ffae-4e5c-83e3-440e1443a68c tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f1940470-82f6-41fb-bd36-96561ad20102" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.018s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2206.281423] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f1940470-82f6-41fb-bd36-96561ad20102" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.930s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2206.281661] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Acquiring lock "f1940470-82f6-41fb-bd36-96561ad20102-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2206.281861] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f1940470-82f6-41fb-bd36-96561ad20102-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2206.282036] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f1940470-82f6-41fb-bd36-96561ad20102-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2206.283958] env[60764]: INFO nova.compute.manager [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Terminating instance [ 2206.285622] env[60764]: DEBUG nova.compute.manager [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Start destroying the instance on the hypervisor. 
{{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2206.285841] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2206.286315] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f8d84395-a136-4906-99f3-c04bbcb80f6e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2206.295091] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08d88e7c-b60a-4256-88b4-6f293f48e7d8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2206.321950] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f1940470-82f6-41fb-bd36-96561ad20102 could not be found. [ 2206.322169] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2206.322343] env[60764]: INFO nova.compute.manager [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2206.322578] env[60764]: DEBUG oslo.service.loopingcall [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2206.322796] env[60764]: DEBUG nova.compute.manager [-] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2206.322895] env[60764]: DEBUG nova.network.neutron [-] [instance: f1940470-82f6-41fb-bd36-96561ad20102] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2206.354377] env[60764]: DEBUG nova.network.neutron [-] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2206.362501] env[60764]: INFO nova.compute.manager [-] [instance: f1940470-82f6-41fb-bd36-96561ad20102] Took 0.04 seconds to deallocate network for instance. 
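The "Acquiring lock ... / Lock ... acquired ... waited / Lock ... released ... held" triplets around the instance UUID and "compute_resources" are produced by oslo.concurrency's named locks: the terminate path logged "waited 440.930s" simply because it queued behind the build path, which held the same per-instance lock for 636.018 s before releasing it. A minimal sketch of that pattern, with hypothetical function names, follows.

    # Hypothetical sketch of the named-lock pattern behind the
    # "Acquiring lock" / "acquired" / "released" DEBUG lines above.
    from oslo_concurrency import lockutils

    INSTANCE_UUID = 'f1940470-82f6-41fb-bd36-96561ad20102'

    @lockutils.synchronized(INSTANCE_UUID)
    def do_build():
        # Held the lock for ~636 s in the log while the build ran, failed
        # with the fileType fault and was cleaned up.
        pass

    @lockutils.synchronized(INSTANCE_UUID)
    def do_terminate():
        # Blocked here (logged as "waited 440.930s") until do_build released
        # the lock, then completed and released it 0.168 s later.
        pass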
[ 2206.449992] env[60764]: DEBUG oslo_concurrency.lockutils [None req-3b2edd42-5af3-46f2-8b77-d94ad47f350a tempest-ServersTestJSON-1786688920 tempest-ServersTestJSON-1786688920-project-member] Lock "f1940470-82f6-41fb-bd36-96561ad20102" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.168s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2252.331443] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2252.331821] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2252.331821] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2252.348573] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2252.348719] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2252.348898] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2252.348988] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2252.349117] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2252.349238] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2253.359244] env[60764]: WARNING oslo_vmware.rw_handles [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2253.359244] env[60764]: ERROR oslo_vmware.rw_handles [ 2253.359876] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2253.361907] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2253.362180] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Copying Virtual Disk [datastore2] vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/5947906e-b603-4dd7-b0ed-2721a470e174/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2253.362465] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4345e4cd-6ed7-4e84-b7fe-0b31421ca980 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.370358] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Waiting for the task: (returnval){ [ 
2253.370358] env[60764]: value = "task-2205066" [ 2253.370358] env[60764]: _type = "Task" [ 2253.370358] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2253.378296] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Task: {'id': task-2205066, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2253.881568] env[60764]: DEBUG oslo_vmware.exceptions [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2253.881861] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2253.882421] env[60764]: ERROR nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2253.882421] env[60764]: Faults: ['InvalidArgument'] [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Traceback (most recent call last): [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] yield resources [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self.driver.spawn(context, instance, image_meta, [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self._fetch_image_if_missing(context, vi) [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] 
image_cache(vi, tmp_image_ds_loc) [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] vm_util.copy_virtual_disk( [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] session._wait_for_task(vmdk_copy_task) [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] return self.wait_for_task(task_ref) [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] return evt.wait() [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] result = hub.switch() [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] return self.greenlet.switch() [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self.f(*self.args, **self.kw) [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] raise exceptions.translate_fault(task_info.error) [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Faults: ['InvalidArgument'] [ 2253.882421] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] [ 2253.883395] env[60764]: INFO nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Terminating instance [ 2253.884310] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 
tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2253.884517] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2253.884756] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-689f3124-87ea-4f5c-8ff3-6c24ef43eb0b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.887010] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2253.887211] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2253.887926] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ea196d7-b245-4ffa-928e-db1a030ca79a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.894771] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2253.894992] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c0131b73-3e9e-49b5-9482-05000f984174 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.897239] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2253.897412] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2253.898343] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-94b9b618-6c64-455f-92be-2c58a819f04f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.903352] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 2253.903352] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ca3c0f-9708-01dc-6da8-411b59a9c7ed" [ 2253.903352] env[60764]: _type = "Task" [ 2253.903352] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2253.910728] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]52ca3c0f-9708-01dc-6da8-411b59a9c7ed, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2253.978728] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2253.978965] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2253.979185] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Deleting the datastore file [datastore2] 55ca3e89-807f-473c-8b5b-346fc2ea23f8 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2253.979452] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-70b2643e-8b48-4d1d-a696-19060ce2c296 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.986192] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Waiting for the task: (returnval){ [ 2253.986192] env[60764]: value = "task-2205068" [ 2253.986192] env[60764]: _type = "Task" [ 2253.986192] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2253.993640] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Task: {'id': task-2205068, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2254.329631] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2254.412723] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2254.413075] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating directory with path [datastore2] vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2254.413211] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b4df1526-ad34-4e6a-8e79-752972e19ad0 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.424118] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Created directory with path [datastore2] vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2254.424296] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Fetch image to [datastore2] vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2254.424458] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2254.425159] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492a3740-89d6-4907-97a5-29bd6b096486 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.431432] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da47c8f8-4eb4-4c6f-941d-99f0a18cc5a7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.440112] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5f4c860b-e4ed-4e39-8b86-bdb3d3857de1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.470709] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b635ab25-4bf8-4157-bdab-82c514579176 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.475953] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7521fd83-e554-4104-a7b1-a75ae4f2cba6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.493765] env[60764]: DEBUG oslo_vmware.api [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Task: {'id': task-2205068, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074779} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2254.494995] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2254.495207] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2254.495382] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2254.495553] env[60764]: INFO nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Took 0.61 seconds to destroy the instance on the hypervisor. 
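Earlier in this span the image upload's write handle was closed with a WARNING: the ESX host closed the HTTPS connection without sending a response, so conn.getresponse() inside rw_handles.close() raised http.client.RemoteDisconnected. The stdlib-only sketch below shows where that exception originates and why it can be treated as a non-fatal close, matching the log, which reports the image as downloaded immediately afterwards; the host and file path are placeholders.

    # Stdlib illustration of the RemoteDisconnected warning seen above: the
    # server drops the connection before replying, so getresponse() raises.
    import http.client

    conn = http.client.HTTPSConnection('esx.example.org', 443)  # placeholder
    try:
        conn.putrequest('PUT', '/folder/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2')
        conn.putheader('Content-Length', '21318656')
        conn.endheaders()
        # ... stream the 21318656 bytes of image data here ...
        conn.getresponse()  # raises if the remote closed without a response
    except http.client.RemoteDisconnected:
        # The data was already written, so the caller only logs a warning
        # and the flow continues to cache and copy the disk.
        pass
    finally:
        conn.close()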
[ 2254.497294] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2254.499312] env[60764]: DEBUG nova.compute.claims [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2254.499481] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2254.499702] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2254.547062] env[60764]: DEBUG oslo_vmware.rw_handles [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2254.607775] env[60764]: DEBUG oslo_vmware.rw_handles [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2254.607924] env[60764]: DEBUG oslo_vmware.rw_handles [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2254.658525] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c0c93ca-25b9-443a-b3dc-e4d213bde29c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.665951] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2a5540-0528-437b-9adf-baab4273a102 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.694645] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4d4c09c-ec46-47e8-8200-dca10a63b5d2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.701055] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa9e6b68-19e2-465b-b27b-d9491d12f0d9 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.713369] env[60764]: DEBUG nova.compute.provider_tree [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2254.721727] env[60764]: DEBUG nova.scheduler.client.report [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2254.734619] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.235s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2254.735150] env[60764]: ERROR nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2254.735150] env[60764]: Faults: ['InvalidArgument'] [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Traceback (most recent call last): [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2254.735150] env[60764]: ERROR 
nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self.driver.spawn(context, instance, image_meta, [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self._fetch_image_if_missing(context, vi) [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] image_cache(vi, tmp_image_ds_loc) [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] vm_util.copy_virtual_disk( [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] session._wait_for_task(vmdk_copy_task) [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] return self.wait_for_task(task_ref) [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] return evt.wait() [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] result = hub.switch() [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] return self.greenlet.switch() [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] self.f(*self.args, **self.kw) [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] raise exceptions.translate_fault(task_info.error) [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Faults: ['InvalidArgument'] [ 2254.735150] env[60764]: ERROR nova.compute.manager [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] [ 2254.736146] env[60764]: DEBUG nova.compute.utils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2254.737151] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Build of instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 was re-scheduled: A specified parameter was not correct: fileType [ 2254.737151] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2254.737522] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2254.737689] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2254.737854] env[60764]: DEBUG nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2254.738030] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2255.040507] env[60764]: DEBUG nova.network.neutron [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2255.050342] env[60764]: INFO nova.compute.manager [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Took 0.31 seconds to deallocate network for instance. [ 2255.136175] env[60764]: INFO nova.scheduler.client.report [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Deleted allocations for instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 [ 2255.157053] env[60764]: DEBUG oslo_concurrency.lockutils [None req-ffc2c70d-f362-4257-9441-2e6d5d43c440 tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 660.795s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.157053] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 463.845s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.157053] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Acquiring lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2255.157053] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.157053] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.158831] env[60764]: INFO nova.compute.manager [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Terminating instance [ 2255.160535] env[60764]: DEBUG nova.compute.manager [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2255.160727] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2255.161214] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9e5e27de-ebb9-4cd1-af2e-3c901aa28787 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.170842] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ffc6e97-2997-4cf8-bede-d61f08c9a648 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.195184] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 55ca3e89-807f-473c-8b5b-346fc2ea23f8 could not be found. [ 2255.195402] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2255.195579] env[60764]: INFO nova.compute.manager [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2255.195855] env[60764]: DEBUG oslo.service.loopingcall [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
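[editor note] The 'Lock "..." acquired by "..." :: waited N.NNNs' and '"released" by "..." :: held N.NNNs' lines above come from oslo.concurrency's lockutils wrapper, which times how long a caller waited for and then held a named lock. A minimal sketch of that pattern, assuming nothing beyond the public lockutils API (the function below is illustrative, not Nova's):

    from oslo_concurrency import lockutils

    synchronized = lockutils.synchronized_with_prefix('nova-')

    @synchronized('compute_resources')
    def _update_usage():
        # Body runs only while the "compute_resources" lock is held; lockutils
        # logs the DEBUG wait/held timings on acquire and release.
        pass

    _update_usage()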
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2255.196107] env[60764]: DEBUG nova.compute.manager [-] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2255.196213] env[60764]: DEBUG nova.network.neutron [-] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2255.223378] env[60764]: DEBUG nova.network.neutron [-] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2255.231750] env[60764]: INFO nova.compute.manager [-] [instance: 55ca3e89-807f-473c-8b5b-346fc2ea23f8] Took 0.04 seconds to deallocate network for instance. [ 2255.314276] env[60764]: DEBUG oslo_concurrency.lockutils [None req-b1655091-6afc-4582-95ca-5faaf5f3dcad tempest-DeleteServersTestJSON-273648637 tempest-DeleteServersTestJSON-273648637-project-member] Lock "55ca3e89-807f-473c-8b5b-346fc2ea23f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.158s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.329758] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2255.340198] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2255.340669] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.340669] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.340781] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2255.341737] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49907548-43cc-4bdf-916c-5da550d42190 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.350570] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b65832c1-7994-4742-b850-6ecf60cc29f9 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.364270] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4341fce6-486d-41e6-9a8a-2cdb51a65e0b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.370332] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e356b092-ecd6-4269-8a9a-076ef20b4019 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.400114] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181253MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2255.400264] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2255.400429] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.449021] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance aa9f1e61-ac26-495c-a698-5163661401a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2255.449021] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2255.449021] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2255.449021] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2255.449413] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2255.449413] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2255.511166] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b9cc8ce-7458-4d97-b105-a5f4cfd21b32 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.518688] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2ea6e4c-0416-4bf3-9b25-86bf6ef22858 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.548009] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a8df583-e289-414d-a4fd-738e7ec7c202 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.555176] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5a49c5a-b1fc-404e-9116-ad96243bdaec {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.568811] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2255.577426] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2255.593386] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2255.593572] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2257.594317] env[60764]: DEBUG oslo_service.periodic_task [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2259.330761] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2259.331137] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2260.325688] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2261.330478] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2263.329892] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2264.330592] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2267.919456] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "7002a10e-fb8b-4892-b010-b943b4fd405d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2267.919782] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "7002a10e-fb8b-4892-b010-b943b4fd405d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2267.930833] env[60764]: DEBUG nova.compute.manager [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Starting instance... 
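[editor note] The 'Running periodic task ComputeManager._...' lines above are emitted by oslo.service's periodic task machinery each time a registered task fires. A minimal sketch of how such tasks are declared, assuming only the public oslo.service and oslo.config APIs (the manager and task names are illustrative, not Nova's ComputeManager):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # Each firing is logged as "Running periodic task ..." by
            # run_periodic_tasks(), as in the entries above.
            pass

    DemoManager().run_periodic_tasks(context=None)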
{{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2267.977008] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2267.977266] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2267.978679] env[60764]: INFO nova.compute.claims [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2268.088019] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a3784f2-e2ec-4481-8091-4f3d4b3021f2 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.095236] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e49811a-4bf3-4ffe-8211-3326e3a3458d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.124031] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-598d67ad-99ef-4f5a-97e1-6544fd0d3737 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.130582] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fc3e0f9-bf7b-407d-90b9-92d56ed68e17 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.143035] env[60764]: DEBUG nova.compute.provider_tree [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2268.152219] env[60764]: DEBUG nova.scheduler.client.report [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2268.165047] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 
tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2268.165492] env[60764]: DEBUG nova.compute.manager [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Start building networks asynchronously for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2268.197810] env[60764]: DEBUG nova.compute.utils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Using /dev/sd instead of None {{(pid=60764) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2268.199342] env[60764]: DEBUG nova.compute.manager [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Allocating IP information in the background. {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2268.199342] env[60764]: DEBUG nova.network.neutron [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] allocate_for_instance() {{(pid=60764) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2268.207180] env[60764]: DEBUG nova.compute.manager [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Start building block device mappings for instance. {{(pid=60764) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2268.266116] env[60764]: DEBUG nova.policy [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8165c7e326c4016a42ba39f68abfce6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5ed1c9589f44a86909b417fac99dab5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60764) authorize /opt/stack/nova/nova/policy.py:203}} [ 2268.269466] env[60764]: DEBUG nova.compute.manager [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Start spawning the instance on the hypervisor. 
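[editor note] The inventory payload repeated in the 'Inventory has not changed for provider ...' entries above is what the resource tracker reports to placement; the capacity placement schedules against is (total - reserved) * allocation_ratio per resource class. A quick check with the numbers from the log:

    # Values copied from the inventory payload logged above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0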
{{(pid=60764) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2268.294701] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T11:22:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T11:22:41Z,direct_url=,disk_format='vmdk',id=04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='c84a3b13dad24426842b23ff07092e6c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T11:22:42Z,virtual_size=,visibility=), allow threads: False {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2268.294957] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2268.295136] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image limits 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2268.295317] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Flavor pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2268.295461] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Image pref 0:0:0 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2268.295633] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60764) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2268.295840] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2268.295998] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2268.296179] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 
tempest-ImagesTestJSON-2052909825-project-member] Got 1 possible topologies {{(pid=60764) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2268.296339] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2268.296506] env[60764]: DEBUG nova.virt.hardware [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60764) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2268.297398] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7288465a-f9ea-4ebe-aad5-19518df425e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.305781] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d6da05e-d646-4673-99a4-b183ab8389df {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.728305] env[60764]: DEBUG nova.network.neutron [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Successfully created port: 0b6f5a94-6801-4707-9428-7eb58343688f {{(pid=60764) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2269.258200] env[60764]: DEBUG nova.compute.manager [req-300084f5-be90-4922-b068-1c03301893ca req-05b8262e-c2ad-4b31-ae6d-27e9322d8dc2 service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Received event network-vif-plugged-0b6f5a94-6801-4707-9428-7eb58343688f {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2269.258471] env[60764]: DEBUG oslo_concurrency.lockutils [req-300084f5-be90-4922-b068-1c03301893ca req-05b8262e-c2ad-4b31-ae6d-27e9322d8dc2 service nova] Acquiring lock "7002a10e-fb8b-4892-b010-b943b4fd405d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2269.258629] env[60764]: DEBUG oslo_concurrency.lockutils [req-300084f5-be90-4922-b068-1c03301893ca req-05b8262e-c2ad-4b31-ae6d-27e9322d8dc2 service nova] Lock "7002a10e-fb8b-4892-b010-b943b4fd405d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2269.258791] env[60764]: DEBUG oslo_concurrency.lockutils [req-300084f5-be90-4922-b068-1c03301893ca req-05b8262e-c2ad-4b31-ae6d-27e9322d8dc2 service nova] Lock "7002a10e-fb8b-4892-b010-b943b4fd405d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2269.258955] env[60764]: DEBUG nova.compute.manager [req-300084f5-be90-4922-b068-1c03301893ca req-05b8262e-c2ad-4b31-ae6d-27e9322d8dc2 service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] No 
waiting events found dispatching network-vif-plugged-0b6f5a94-6801-4707-9428-7eb58343688f {{(pid=60764) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2269.259194] env[60764]: WARNING nova.compute.manager [req-300084f5-be90-4922-b068-1c03301893ca req-05b8262e-c2ad-4b31-ae6d-27e9322d8dc2 service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Received unexpected event network-vif-plugged-0b6f5a94-6801-4707-9428-7eb58343688f for instance with vm_state building and task_state spawning. [ 2269.336046] env[60764]: DEBUG nova.network.neutron [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Successfully updated port: 0b6f5a94-6801-4707-9428-7eb58343688f {{(pid=60764) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2269.348425] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "refresh_cache-7002a10e-fb8b-4892-b010-b943b4fd405d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2269.348575] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "refresh_cache-7002a10e-fb8b-4892-b010-b943b4fd405d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2269.348757] env[60764]: DEBUG nova.network.neutron [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2269.410421] env[60764]: DEBUG nova.network.neutron [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2269.562674] env[60764]: DEBUG nova.network.neutron [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Updating instance_info_cache with network_info: [{"id": "0b6f5a94-6801-4707-9428-7eb58343688f", "address": "fa:16:3e:1c:2c:68", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b6f5a94-68", "ovs_interfaceid": "0b6f5a94-6801-4707-9428-7eb58343688f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2269.573814] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "refresh_cache-7002a10e-fb8b-4892-b010-b943b4fd405d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2269.574089] env[60764]: DEBUG nova.compute.manager [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Instance network_info: |[{"id": "0b6f5a94-6801-4707-9428-7eb58343688f", "address": "fa:16:3e:1c:2c:68", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b6f5a94-68", "ovs_interfaceid": "0b6f5a94-6801-4707-9428-7eb58343688f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60764) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2269.574460] env[60764]: DEBUG 
nova.virt.vmwareapi.vmops [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1c:2c:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '32463b6d-4569-4755-8a29-873a028690a7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0b6f5a94-6801-4707-9428-7eb58343688f', 'vif_model': 'vmxnet3'}] {{(pid=60764) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2269.582132] env[60764]: DEBUG oslo.service.loopingcall [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2269.582465] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Creating VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2269.582686] env[60764]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b69345b-9e00-4f5a-95ae-1d82b512dc1c {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2269.602811] env[60764]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2269.602811] env[60764]: value = "task-2205069" [ 2269.602811] env[60764]: _type = "Task" [ 2269.602811] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2269.609966] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205069, 'name': CreateVM_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2270.112814] env[60764]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205069, 'name': CreateVM_Task, 'duration_secs': 0.304909} completed successfully. 
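[editor note] The 'Task: {...} progress is 0%.' and '... completed successfully.' entries above come from polling CreateVM_Task on a fixed interval; the earlier traceback shows this runs through oslo_vmware's bundled common/loopingcall, and oslo.service exposes the same pattern as FixedIntervalLoopingCall. A self-contained sketch of that polling pattern (the callback below is a stand-in, not the real task poll):

    from oslo_service import loopingcall

    _polls = {'count': 0}

    def _poll_task():
        # Stand-in for a per-interval task status check; finish on the 3rd poll.
        _polls['count'] += 1
        if _polls['count'] >= 3:
            raise loopingcall.LoopingCallDone(retvalue='success')

    timer = loopingcall.FixedIntervalLoopingCall(_poll_task)
    result = timer.start(interval=0.5).wait()   # blocks, then returns 'success'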
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2270.112965] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Created VM on the ESX host {{(pid=60764) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2270.113683] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2270.113789] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2270.114117] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2270.114408] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b499713a-5d19-4f76-8aff-46193c50b3d1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2270.118483] env[60764]: DEBUG oslo_vmware.api [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2270.118483] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521e62de-e1b2-3b99-f351-f963eac9c555" [ 2270.118483] env[60764]: _type = "Task" [ 2270.118483] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2270.127249] env[60764]: DEBUG oslo_vmware.api [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]521e62de-e1b2-3b99-f351-f963eac9c555, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2270.630364] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2270.630757] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Processing image 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2270.630813] env[60764]: DEBUG oslo_concurrency.lockutils [None req-d8457d46-4560-463a-bee7-2cfb75d215f2 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2271.287490] env[60764]: DEBUG nova.compute.manager [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Received event network-changed-0b6f5a94-6801-4707-9428-7eb58343688f {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2271.287694] env[60764]: DEBUG nova.compute.manager [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Refreshing instance network info cache due to event network-changed-0b6f5a94-6801-4707-9428-7eb58343688f. {{(pid=60764) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2271.287904] env[60764]: DEBUG oslo_concurrency.lockutils [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] Acquiring lock "refresh_cache-7002a10e-fb8b-4892-b010-b943b4fd405d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2271.288083] env[60764]: DEBUG oslo_concurrency.lockutils [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] Acquired lock "refresh_cache-7002a10e-fb8b-4892-b010-b943b4fd405d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2271.288245] env[60764]: DEBUG nova.network.neutron [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Refreshing network info cache for port 0b6f5a94-6801-4707-9428-7eb58343688f {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2271.517578] env[60764]: DEBUG nova.network.neutron [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Updated VIF entry in instance network info cache for port 0b6f5a94-6801-4707-9428-7eb58343688f. 
{{(pid=60764) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2271.517924] env[60764]: DEBUG nova.network.neutron [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Updating instance_info_cache with network_info: [{"id": "0b6f5a94-6801-4707-9428-7eb58343688f", "address": "fa:16:3e:1c:2c:68", "network": {"id": "6fca304c-0605-4df1-816c-6a41d3a44163", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1810377619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5ed1c9589f44a86909b417fac99dab5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "32463b6d-4569-4755-8a29-873a028690a7", "external-id": "nsx-vlan-transportzone-349", "segmentation_id": 349, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0b6f5a94-68", "ovs_interfaceid": "0b6f5a94-6801-4707-9428-7eb58343688f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2271.526950] env[60764]: DEBUG oslo_concurrency.lockutils [req-b00dc800-10c6-4fed-853b-eead0f49646e req-f45fae12-70f3-42e3-b56d-633f94250d1f service nova] Releasing lock "refresh_cache-7002a10e-fb8b-4892-b010-b943b4fd405d" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2296.235233] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2296.235728] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Getting list of instances from cluster (obj){ [ 2296.235728] env[60764]: value = "domain-c8" [ 2296.235728] env[60764]: _type = "ClusterComputeResource" [ 2296.235728] env[60764]: } {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2296.236842] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37dbff91-0472-4330-82c4-166d0d55e12e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2296.249198] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Got total of 5 instances {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2303.378585] env[60764]: WARNING oslo_vmware.rw_handles [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 
2303.378585] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2303.378585] env[60764]: ERROR oslo_vmware.rw_handles [ 2303.379386] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2303.381313] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2303.381569] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Copying Virtual Disk [datastore2] vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/cb23b14f-70f5-4d31-8285-8391e5206108/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2303.381850] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8acf4dd0-47d5-438a-90d5-a16ece495936 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.389147] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 2303.389147] env[60764]: value = "task-2205070" [ 2303.389147] env[60764]: _type = "Task" [ 2303.389147] env[60764]: } to complete. 
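[editor note] The WARNING from oslo_vmware.rw_handles above ends in http.client.RemoteDisconnected, which the standard library raises when getresponse() finds the peer closed the connection without sending a status line; the entries that follow show the image download completing and the caching copy being attempted anyway. An illustration of where that exception originates (host and URL are placeholders, and this is not the rw_handles implementation):

    import http.client

    conn = http.client.HTTPSConnection('vcenter.example.org', 443)
    conn.putrequest('PUT', '/folder/tmp-sparse.vmdk?dsName=datastore2')
    conn.endheaders()
    try:
        conn.getresponse()          # raises if the server hangs up silently
    except http.client.RemoteDisconnected:
        # Same condition rw_handles close() hits above: the remote end closed
        # the connection without returning an HTTP response.
        pass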
{{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2303.397205] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2205070, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2303.900052] env[60764]: DEBUG oslo_vmware.exceptions [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Fault InvalidArgument not matched. {{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2303.900052] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2303.900052] env[60764]: ERROR nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2303.900052] env[60764]: Faults: ['InvalidArgument'] [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Traceback (most recent call last): [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] yield resources [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self.driver.spawn(context, instance, image_meta, [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self._fetch_image_if_missing(context, vi) [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] image_cache(vi, tmp_image_ds_loc) [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: 
aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] vm_util.copy_virtual_disk( [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] session._wait_for_task(vmdk_copy_task) [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] return self.wait_for_task(task_ref) [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] return evt.wait() [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] result = hub.switch() [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] return self.greenlet.switch() [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self.f(*self.args, **self.kw) [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] raise exceptions.translate_fault(task_info.error) [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Faults: ['InvalidArgument'] [ 2303.900052] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] [ 2303.901128] env[60764]: INFO nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Terminating instance [ 2303.901854] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2303.902077] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2303.902317] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-78a2db1d-ec8e-438b-8ff1-27f4895eb18e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.904528] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2303.904719] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2303.905452] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-255405e1-1828-4e3b-b3e1-355a33417763 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.912080] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2303.912370] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-09206c97-4af1-48b2-8c3f-5d348747fe7d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.914529] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2303.914698] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2303.915680] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d37161c-1dce-40a9-8bc6-c9bc1249f573 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.920622] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2303.920622] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]522d0ee4-5021-dcc3-3bc1-2c8a545a5029" [ 2303.920622] env[60764]: _type = "Task" [ 2303.920622] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2303.929251] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]522d0ee4-5021-dcc3-3bc1-2c8a545a5029, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2303.977748] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2303.977994] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2303.978149] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleting the datastore file [datastore2] aa9f1e61-ac26-495c-a698-5163661401a5 {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2303.978415] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-23fa3717-323f-49ee-aaea-f026d6cb39df {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2303.984373] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for the task: (returnval){ [ 2303.984373] env[60764]: value = "task-2205072" [ 2303.984373] env[60764]: _type = "Task" [ 2303.984373] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2303.991930] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2205072, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2304.430835] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2304.431171] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating directory with path [datastore2] vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2304.431284] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5852e506-a54c-4045-bf79-b40b268e1d97 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.442136] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Created directory with path [datastore2] vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2304.442319] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Fetch image to [datastore2] vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2304.442487] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2304.443203] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49b480a8-f7fe-408a-abfa-57c806bb8f9f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.449374] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3644dc41-6979-403c-87fa-7c8051a08b33 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.458057] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-811f3f32-e3a9-4366-98e1-e2c2c07e8a57 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.490128] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17d5c855-5205-4926-928d-50eb0162b80d {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.496790] env[60764]: DEBUG oslo_vmware.api [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Task: {'id': task-2205072, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079213} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2304.498188] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2304.498375] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2304.498548] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2304.498718] env[60764]: INFO nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 2304.500425] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4b0d7ad7-f477-49ad-a94e-e5bfc76d1a75 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.502199] env[60764]: DEBUG nova.compute.claims [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2304.502369] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2304.502588] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2304.527482] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2304.581543] env[60764]: DEBUG oslo_vmware.rw_handles [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2304.639259] env[60764]: DEBUG oslo_vmware.rw_handles [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2304.639446] env[60764]: DEBUG oslo_vmware.rw_handles [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2304.668445] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c54d02fb-4a43-4f1d-89d1-e8362d128238 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.675596] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4008590b-00e2-49e1-8dbd-f215f46140bc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.705196] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7dde00a-3923-44ad-8d46-cf6fa6409f5b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.711539] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68fa2d90-e8e6-4311-8782-fa515e5c9a15 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.724132] env[60764]: DEBUG nova.compute.provider_tree [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2304.732575] env[60764]: DEBUG nova.scheduler.client.report [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2304.747921] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2304.748450] env[60764]: ERROR nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2304.748450] env[60764]: Faults: ['InvalidArgument'] [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Traceback (most recent call last): [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2304.748450] 
env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self.driver.spawn(context, instance, image_meta, [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self._fetch_image_if_missing(context, vi) [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] image_cache(vi, tmp_image_ds_loc) [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] vm_util.copy_virtual_disk( [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] session._wait_for_task(vmdk_copy_task) [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] return self.wait_for_task(task_ref) [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] return evt.wait() [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] result = hub.switch() [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] return self.greenlet.switch() [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] self.f(*self.args, **self.kw) [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] raise exceptions.translate_fault(task_info.error) [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Faults: ['InvalidArgument'] [ 2304.748450] env[60764]: ERROR nova.compute.manager [instance: aa9f1e61-ac26-495c-a698-5163661401a5] [ 2304.749192] env[60764]: DEBUG nova.compute.utils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2304.750456] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Build of instance aa9f1e61-ac26-495c-a698-5163661401a5 was re-scheduled: A specified parameter was not correct: fileType [ 2304.750456] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2304.750823] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2304.750991] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2304.751177] env[60764]: DEBUG nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2304.751336] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2305.059478] env[60764]: DEBUG nova.network.neutron [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2305.070877] env[60764]: INFO nova.compute.manager [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Took 0.32 seconds to deallocate network for instance. [ 2305.163396] env[60764]: INFO nova.scheduler.client.report [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Deleted allocations for instance aa9f1e61-ac26-495c-a698-5163661401a5 [ 2305.185303] env[60764]: DEBUG oslo_concurrency.lockutils [None req-11b1b404-1959-4a0a-acd1-6d2a1ec6fb72 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "aa9f1e61-ac26-495c-a698-5163661401a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 555.016s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2305.185553] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "aa9f1e61-ac26-495c-a698-5163661401a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 358.991s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2305.185850] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Acquiring lock "aa9f1e61-ac26-495c-a698-5163661401a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2305.186079] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "aa9f1e61-ac26-495c-a698-5163661401a5-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2305.186255] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "aa9f1e61-ac26-495c-a698-5163661401a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2305.188356] env[60764]: INFO nova.compute.manager [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Terminating instance [ 2305.190570] env[60764]: DEBUG nova.compute.manager [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2305.190801] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2305.191290] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1c0c8509-1181-42bd-9ad0-89fd66862c46 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2305.200451] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c5ec92-3137-4849-9996-6d821796e06d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2305.227262] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aa9f1e61-ac26-495c-a698-5163661401a5 could not be found. [ 2305.227468] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2305.227655] env[60764]: INFO nova.compute.manager [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2305.227894] env[60764]: DEBUG oslo.service.loopingcall [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2305.228392] env[60764]: DEBUG nova.compute.manager [-] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2305.228498] env[60764]: DEBUG nova.network.neutron [-] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2305.257993] env[60764]: DEBUG nova.network.neutron [-] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2305.265877] env[60764]: INFO nova.compute.manager [-] [instance: aa9f1e61-ac26-495c-a698-5163661401a5] Took 0.04 seconds to deallocate network for instance. [ 2305.348287] env[60764]: DEBUG oslo_concurrency.lockutils [None req-a8a5eaae-89ab-4975-9b6b-4e7e005337e3 tempest-ServerDiskConfigTestJSON-621872599 tempest-ServerDiskConfigTestJSON-621872599-project-member] Lock "aa9f1e61-ac26-495c-a698-5163661401a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2312.246068] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2312.260515] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Getting list of instances from cluster (obj){ [ 2312.260515] env[60764]: value = "domain-c8" [ 2312.260515] env[60764]: _type = "ClusterComputeResource" [ 2312.260515] env[60764]: } {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2312.261794] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1867da2e-65d2-4446-a13c-247ff318effa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.273689] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Got total of 4 instances {{(pid=60764) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2312.273936] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 16597080-42c7-40df-9893-38751d9ac11a {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2312.274081] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid b12b4099-9dd1-4219-823d-79cdef6a4e5e {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2312.274246] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] 
Triggering sync for uuid c6fbe481-a9e0-40d5-9cac-e4645a50be1a {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2312.274398] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Triggering sync for uuid 7002a10e-fb8b-4892-b010-b943b4fd405d {{(pid=60764) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2312.274684] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "16597080-42c7-40df-9893-38751d9ac11a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2312.274909] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2312.275120] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "c6fbe481-a9e0-40d5-9cac-e4645a50be1a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2312.275320] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "7002a10e-fb8b-4892-b010-b943b4fd405d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2314.359923] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2314.359923] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2314.359923] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2314.372937] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2314.373101] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2314.373228] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2314.373354] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2314.373476] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2314.373923] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.330638] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.342560] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.342809] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.343009] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.343195] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2316.344395] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cbf3cd5-af41-40cc-b66b-a1f4eeab7865 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.353470] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd3cdee2-301f-4ecc-b007-5c1b0f016d1f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.368320] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a78951b-2c01-418d-aa91-4ccc2086fdaa {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.374856] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f41470e3-0737-4e0d-bb8e-f3edbf4f3a3e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.404217] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181268MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2316.404351] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.404534] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.488438] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 16597080-42c7-40df-9893-38751d9ac11a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.488599] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.488727] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.488851] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7002a10e-fb8b-4892-b010-b943b4fd405d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2316.489044] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2316.489187] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2316.503840] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing inventories for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2316.516037] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating ProviderTree inventory for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2316.516210] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Updating inventory in ProviderTree for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2316.526007] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing aggregate associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, aggregates: None {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2316.541699] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Refreshing trait associations for resource provider 67a94047-1c18-43e8-9b47-05a1d30bcca4, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60764) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2316.593368] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef1392df-0d76-4e47-8b10-7ffaa82e7e6b {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.600917] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-77224fb5-f78c-42da-9093-76ec74c4b4f8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.629876] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-007efbc7-1a2d-4229-9c21-fdde6069e640 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.636714] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22026277-da4b-43d2-adac-2d90551ea98a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.649671] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2316.657273] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2316.671915] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2316.672107] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.672312] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2317.677638] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2318.330581] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2318.330815] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2318.340180] env[60764]: DEBUG nova.compute.manager [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] There are 0 instances to clean {{(pid=60764) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2319.335129] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2319.349018] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2319.349227] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2320.339773] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2323.329854] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2324.330640] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2325.330532] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2325.330709] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Cleaning up deleted instances with incomplete migration {{(pid=60764) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2326.338457] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2353.395315] env[60764]: WARNING oslo_vmware.rw_handles [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 
2353.395315] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2353.395315] env[60764]: ERROR oslo_vmware.rw_handles [ 2353.396067] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2353.397903] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2353.398225] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Copying Virtual Disk [datastore2] vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/f1298fda-d8c8-4830-857f-df66676670bf/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2353.398533] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b0571705-41ef-425e-b4bc-1777911a4a7e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.406741] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2353.406741] env[60764]: value = "task-2205073" [ 2353.406741] env[60764]: _type = "Task" [ 2353.406741] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2353.414527] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205073, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2353.917246] env[60764]: DEBUG oslo_vmware.exceptions [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2353.917533] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2353.918171] env[60764]: ERROR nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2353.918171] env[60764]: Faults: ['InvalidArgument'] [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] Traceback (most recent call last): [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] yield resources [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self.driver.spawn(context, instance, image_meta, [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self._fetch_image_if_missing(context, vi) [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] image_cache(vi, tmp_image_ds_loc) [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] vm_util.copy_virtual_disk( [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] session._wait_for_task(vmdk_copy_task) [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] return self.wait_for_task(task_ref) [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] return evt.wait() [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] result = hub.switch() [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] return self.greenlet.switch() [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self.f(*self.args, **self.kw) [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] raise exceptions.translate_fault(task_info.error) [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] Faults: ['InvalidArgument'] [ 2353.918171] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] [ 2353.919345] env[60764]: INFO nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Terminating instance [ 2353.920075] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2353.920273] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2353.920505] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c1496da8-07ce-43a3-8f68-0564a4614e35 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.922670] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2353.922863] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2353.923578] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e728fd49-b8d3-45b8-b8b6-ddfa8dfe9ca7 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.930257] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2353.931236] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2da7cb4c-bef7-47af-a78d-5a2c65142f07 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.932545] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2353.932703] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2353.933361] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3097db89-4e15-49b9-be85-095a3df3d39e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2353.938758] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for the task: (returnval){ [ 2353.938758] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524fe88f-ee2d-04ee-862a-f4a4549b4e85" [ 2353.938758] env[60764]: _type = "Task" [ 2353.938758] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2353.945775] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]524fe88f-ee2d-04ee-862a-f4a4549b4e85, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2354.006089] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2354.006309] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2354.006443] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleting the datastore file [datastore2] 16597080-42c7-40df-9893-38751d9ac11a {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2354.006728] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-35288d0b-4c88-406b-8377-ce39c3699e42 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.012838] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for the task: (returnval){ [ 2354.012838] env[60764]: value = "task-2205075" [ 2354.012838] env[60764]: _type = "Task" [ 2354.012838] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2354.020395] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205075, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2354.449350] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2354.449856] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating directory with path [datastore2] vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2354.449856] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3c5a4479-f7ff-492f-887c-f6d50cd82a1f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.460960] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Created directory with path [datastore2] vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2354.461226] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Fetch image to [datastore2] vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2354.461306] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2354.461991] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-884fd7ec-bc17-417f-9c2c-f9bbe3e6d958 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.468327] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a88fe08-fa3b-473b-9fb5-bed43fe824e3 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.478047] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71184e5a-af62-4045-b9c9-f77472adfe60 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.508714] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8be0456-3e32-4773-aaab-c85937ced241 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.516835] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8d63dc8e-a011-4ea4-8d09-ae53c80e343e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.522672] env[60764]: DEBUG oslo_vmware.api [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Task: {'id': task-2205075, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066284} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2354.522904] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2354.523108] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2354.523284] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2354.523465] env[60764]: INFO nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Took 0.60 seconds to destroy the instance on the hypervisor. 
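The task-handling entries above (CopyVirtualDisk_Task, UnregisterVM, DeleteDatastoreFile_Task) all follow the same pattern: oslo.vmware submits a vCenter task, then the wait_for_task/_poll_task frames visible in the spawn traceback poll the task state until it either succeeds or reports a fault, at which point translate_fault() raises the VimFaultException "A specified parameter was not correct: fileType" / Faults: ['InvalidArgument'] recorded above. The following is a minimal, self-contained sketch of that poll-until-done pattern, using hypothetical stand-ins (TaskInfo, fetch_task_info, TaskFaultError); it illustrates the mechanism only and is not the oslo.vmware implementation.

import time
from dataclasses import dataclass


class TaskFaultError(Exception):
    """Stand-in for the translated fault (e.g. InvalidArgument) raised on task error."""


@dataclass
class TaskInfo:
    state: str                 # 'queued' | 'running' | 'success' | 'error'
    progress: int = 0
    error: str | None = None   # fault detail when state == 'error'


def wait_for_task(fetch_task_info, poll_interval=0.5, timeout=300.0):
    """Poll a remote task until it finishes; raise if it errors or times out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = fetch_task_info()          # one property-collector round trip per poll
        if info.state == 'success':
            return info
        if info.state == 'error':
            # analogous to exceptions.translate_fault(task_info.error) in the traceback
            raise TaskFaultError(info.error or 'unknown fault')
        time.sleep(poll_interval)         # the "progress is 0%" lines are logged between polls
    raise TimeoutError('task did not complete within %.0fs' % timeout)

In the failure above the copy task ends in the error state, so the wait raises and the compute manager aborts the resource claim and re-schedules the build, as the subsequent "Failed to build and run instance" and "was re-scheduled" entries show.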
[ 2354.525517] env[60764]: DEBUG nova.compute.claims [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2354.525688] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2354.525940] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2354.538313] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2354.591014] env[60764]: DEBUG oslo_vmware.rw_handles [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2354.650531] env[60764]: DEBUG oslo_vmware.rw_handles [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2354.650716] env[60764]: DEBUG oslo_vmware.rw_handles [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2354.691846] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b26efa2-8ad4-43ae-a4e9-6ba06ef77268 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.698830] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52f7ae63-e53d-4ed3-8e24-87c5cd51ec41 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.728659] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01e357ba-d952-4819-83e2-091dd7c87877 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2354.734900] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90a03a52-e418-4666-b862-d78f0f1e7b02 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2355.428660] env[60764]: DEBUG nova.compute.provider_tree [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2355.437290] env[60764]: DEBUG nova.scheduler.client.report [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2355.451040] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.925s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2355.451592] env[60764]: ERROR nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2355.451592] env[60764]: Faults: ['InvalidArgument'] [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] Traceback (most recent call last): [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 
16597080-42c7-40df-9893-38751d9ac11a] self.driver.spawn(context, instance, image_meta, [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self._fetch_image_if_missing(context, vi) [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] image_cache(vi, tmp_image_ds_loc) [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] vm_util.copy_virtual_disk( [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] session._wait_for_task(vmdk_copy_task) [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] return self.wait_for_task(task_ref) [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] return evt.wait() [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] result = hub.switch() [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] return self.greenlet.switch() [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] self.f(*self.args, **self.kw) [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] raise exceptions.translate_fault(task_info.error) [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] Faults: ['InvalidArgument'] [ 2355.451592] env[60764]: ERROR nova.compute.manager [instance: 16597080-42c7-40df-9893-38751d9ac11a] [ 2355.452975] env[60764]: DEBUG nova.compute.utils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2355.453550] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Build of instance 16597080-42c7-40df-9893-38751d9ac11a was re-scheduled: A specified parameter was not correct: fileType [ 2355.453550] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2355.453915] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2355.454097] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2355.454264] env[60764]: DEBUG nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2355.454420] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2355.736221] env[60764]: DEBUG nova.network.neutron [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2355.749627] env[60764]: INFO nova.compute.manager [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Took 0.30 seconds to deallocate network for instance. [ 2355.887472] env[60764]: INFO nova.scheduler.client.report [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Deleted allocations for instance 16597080-42c7-40df-9893-38751d9ac11a [ 2355.921544] env[60764]: DEBUG oslo_concurrency.lockutils [None req-63bad87d-ab87-407b-af90-4f2c84c68455 tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "16597080-42c7-40df-9893-38751d9ac11a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 481.475s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2355.921800] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "16597080-42c7-40df-9893-38751d9ac11a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 285.333s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2355.922061] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Acquiring lock "16597080-42c7-40df-9893-38751d9ac11a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2355.922230] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "16597080-42c7-40df-9893-38751d9ac11a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2355.922395] env[60764]: 
DEBUG oslo_concurrency.lockutils [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "16597080-42c7-40df-9893-38751d9ac11a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2355.924578] env[60764]: INFO nova.compute.manager [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Terminating instance [ 2355.926274] env[60764]: DEBUG nova.compute.manager [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2355.926462] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2355.926926] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-df7daf66-1293-4f17-99cc-670afe2ec0a6 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2355.935910] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b48a1aa-0043-425b-b884-d8a5e41db08d {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2355.959889] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 16597080-42c7-40df-9893-38751d9ac11a could not be found. [ 2355.960611] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2355.960611] env[60764]: INFO nova.compute.manager [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2355.960611] env[60764]: DEBUG oslo.service.loopingcall [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2355.960736] env[60764]: DEBUG nova.compute.manager [-] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2355.960736] env[60764]: DEBUG nova.network.neutron [-] [instance: 16597080-42c7-40df-9893-38751d9ac11a] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2355.992783] env[60764]: DEBUG nova.network.neutron [-] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2356.001325] env[60764]: INFO nova.compute.manager [-] [instance: 16597080-42c7-40df-9893-38751d9ac11a] Took 0.04 seconds to deallocate network for instance. [ 2356.091192] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8b3e1525-8817-4e8f-ac39-c162ef3891ff tempest-ImagesTestJSON-2052909825 tempest-ImagesTestJSON-2052909825-project-member] Lock "16597080-42c7-40df-9893-38751d9ac11a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2356.091949] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "16597080-42c7-40df-9893-38751d9ac11a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 43.817s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2356.092152] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 16597080-42c7-40df-9893-38751d9ac11a] During sync_power_state the instance has a pending task (deleting). Skip. [ 2356.092313] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "16597080-42c7-40df-9893-38751d9ac11a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2375.331417] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2375.331740] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Starting heal instance info cache {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2375.331740] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Rebuilding the list of instances to heal {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2375.345991] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network cache update for instance because it is Building. 
{{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2375.346188] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2375.346302] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: 7002a10e-fb8b-4892-b010-b943b4fd405d] Skipping network cache update for instance because it is Building. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2375.346425] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Didn't find any instances for network info cache update. {{(pid=60764) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2376.329596] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2376.329852] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2376.344370] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2376.344723] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2376.344723] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2376.344885] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60764) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2376.346064] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99eabe07-a8e9-49ec-8681-e962c5bae388 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.356457] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97c42eda-fe10-4cf1-8a57-e9bd7716f14a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.368759] env[60764]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a620c58-b76f-4c35-9184-1d8c9d2267e5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.375283] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-885745ee-b4a9-4f85-aebd-8df7b5b7952e {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.404167] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181251MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60764) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2376.414775] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2376.414775] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2376.514938] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2376.514938] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance c6fbe481-a9e0-40d5-9cac-e4645a50be1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2376.514938] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Instance 7002a10e-fb8b-4892-b010-b943b4fd405d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60764) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2376.515133] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2376.515172] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=60764) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2376.612023] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-979f0d0b-8ba8-4621-a112-f5640dcb72cc {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.621868] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7c0ea3b-217f-4fd7-9ad9-7423d29bdc20 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.650398] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-546d8304-45c5-4daa-8658-a77dd6d32c02 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.657148] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eff5cbeb-cc82-4997-9322-bb953a6be634 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2376.669835] env[60764]: DEBUG nova.compute.provider_tree [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2376.678193] env[60764]: DEBUG nova.scheduler.client.report [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2376.691211] env[60764]: DEBUG nova.compute.resource_tracker [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60764) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2376.691381] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2378.691511] env[60764]: DEBUG oslo_service.periodic_task [None 
req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2379.329720] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2379.329903] env[60764]: DEBUG nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60764) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2381.325673] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2382.582035] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2384.330119] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2385.330238] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2387.330605] env[60764]: DEBUG oslo_service.periodic_task [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60764) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2402.342500] env[60764]: WARNING oslo_vmware.rw_handles [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles response.begin() [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2402.342500] env[60764]: ERROR oslo_vmware.rw_handles [ 2402.343149] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Downloaded image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2402.345303] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Caching image {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2402.345572] env[60764]: DEBUG nova.virt.vmwareapi.vm_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Copying Virtual Disk [datastore2] vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk to [datastore2] vmware_temp/c8d4cd49-4943-4584-b519-220c8249dda2/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk {{(pid=60764) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2402.345874] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9b6b9751-0987-49e5-8ca9-7c88cb447114 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.353655] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for the task: (returnval){ [ 2402.353655] env[60764]: value = "task-2205076" [ 2402.353655] env[60764]: _type = "Task" [ 2402.353655] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2402.361519] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': task-2205076, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2402.863629] env[60764]: DEBUG oslo_vmware.exceptions [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Fault InvalidArgument not matched. 
{{(pid=60764) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2402.867041] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Releasing lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2402.867041] env[60764]: ERROR nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2402.867041] env[60764]: Faults: ['InvalidArgument'] [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Traceback (most recent call last): [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] yield resources [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self.driver.spawn(context, instance, image_meta, [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self._fetch_image_if_missing(context, vi) [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] image_cache(vi, tmp_image_ds_loc) [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] vm_util.copy_virtual_disk( [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] session._wait_for_task(vmdk_copy_task) [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] return self.wait_for_task(task_ref) [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] return evt.wait() [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] result = hub.switch() [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] return self.greenlet.switch() [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self.f(*self.args, **self.kw) [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] raise exceptions.translate_fault(task_info.error) [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Faults: ['InvalidArgument'] [ 2402.867041] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] [ 2402.867041] env[60764]: INFO nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Terminating instance [ 2402.867993] env[60764]: DEBUG oslo_concurrency.lockutils [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "[datastore2] devstack-image-cache_base/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d.vmdk" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2402.867993] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2402.867993] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-38a5da40-e51c-4297-9f97-00ee4af41a84 {{(pid=60764) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.871281] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2402.871572] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2402.871844] env[60764]: DEBUG nova.network.neutron [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2402.878761] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2402.879069] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60764) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2402.880278] env[60764]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e83a3ffb-f162-4d5f-91c5-09b8fde8166f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.887473] env[60764]: DEBUG oslo_vmware.api [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for the task: (returnval){ [ 2402.887473] env[60764]: value = "session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5232df1e-620c-ffa3-8379-e19f47ef92c1" [ 2402.887473] env[60764]: _type = "Task" [ 2402.887473] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2402.895096] env[60764]: DEBUG oslo_vmware.api [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': session[52d8d00f-a478-3fe5-99d9-353c09a4f9ae]5232df1e-620c-ffa3-8379-e19f47ef92c1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2402.900775] env[60764]: DEBUG nova.network.neutron [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2402.964846] env[60764]: DEBUG nova.network.neutron [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2402.972885] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Releasing lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2402.973726] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2402.973726] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2402.974774] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d447dcb-c98e-45fa-b246-a842fe44c675 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2402.983766] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Unregistering the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2402.983766] env[60764]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9a915aeb-3078-4d00-971d-86867e75948a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.013522] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Unregistered the VM {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2403.013737] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Deleting contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2403.013898] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Deleting the datastore file [datastore2] b12b4099-9dd1-4219-823d-79cdef6a4e5e {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2403.014193] 
env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5432c1c5-cf16-438c-b508-4ef79d501874 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.019602] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for the task: (returnval){ [ 2403.019602] env[60764]: value = "task-2205078" [ 2403.019602] env[60764]: _type = "Task" [ 2403.019602] env[60764]: } to complete. {{(pid=60764) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2403.026749] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': task-2205078, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2403.398542] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Preparing fetch location {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2403.398828] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating directory with path [datastore2] vmware_temp/23f240ad-be33-44ec-ad7f-ab450419fbff/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2403.399067] env[60764]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a874ce0-c312-4802-a874-a89ab92992d1 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.410452] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Created directory with path [datastore2] vmware_temp/23f240ad-be33-44ec-ad7f-ab450419fbff/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d {{(pid=60764) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2403.410754] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Fetch image to [datastore2] vmware_temp/23f240ad-be33-44ec-ad7f-ab450419fbff/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk {{(pid=60764) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2403.411053] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to [datastore2] vmware_temp/23f240ad-be33-44ec-ad7f-ab450419fbff/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk on the data store datastore2 {{(pid=60764) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2403.412250] env[60764]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bae1c6e4-1005-46eb-ab82-5d047892ca7a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.418581] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-656e4e66-b2fd-4d2b-af9c-9e780bee8554 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.427279] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2baa7e00-3c22-480f-a4e7-6b7fbd6f4996 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.456368] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f26d45f-324d-493b-9895-9c260029a227 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.461692] env[60764]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-75de80a0-caf0-4046-945b-b40d4fce526f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.483736] env[60764]: DEBUG nova.virt.vmwareapi.images [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: c6fbe481-a9e0-40d5-9cac-e4645a50be1a] Downloading image file data 04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d to the data store datastore2 {{(pid=60764) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2403.529206] env[60764]: DEBUG oslo_vmware.api [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Task: {'id': task-2205078, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.047955} completed successfully. {{(pid=60764) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2403.529444] env[60764]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Deleted the datastore file {{(pid=60764) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2403.529617] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Deleted contents of the VM from datastore datastore2 {{(pid=60764) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2403.529779] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2403.529939] env[60764]: INFO nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Took 0.56 seconds to destroy the instance on the hypervisor. 
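The traceback above bottoms out in oslo_vmware's task polling: wait_for_task() waits on an event while a looping call polls the task, and when the task ends in error the fault is translated and re-raised, which is where the "A specified parameter was not correct: fileType" / InvalidArgument failure surfaces. The DeleteDatastoreFile_Task records that follow show the same loop succeeding ("progress is 0%", then "completed successfully" with duration_secs). Below is a minimal, self-contained sketch of that poll-until-terminal pattern; the TaskInfo shape and the fake responses are illustrative assumptions, not the oslo.vmware API.

# Minimal sketch of the poll-until-terminal pattern the traceback above walks
# through (wait_for_task -> _poll_task -> raise translated fault). The task
# states and the fake task-info callable are illustrative assumptions; the
# real implementation lives in oslo_vmware/api.py and common/loopingcall.py.
import time
from dataclasses import dataclass


class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, msg, faults):
        super().__init__(msg)
        self.faults = faults


@dataclass
class TaskInfo:
    state: str            # 'running', 'success' or 'error'
    progress: int = 0
    error_msg: str = ""
    faults: tuple = ()


def wait_for_task(get_task_info, interval=0.5):
    """Poll a task until it reaches a terminal state.

    On success the TaskInfo is returned (the log then prints duration_secs);
    on error a fault exception is raised, which is what produced the
    "A specified parameter was not correct: fileType" failure above.
    """
    start = time.monotonic()
    while True:
        info = get_task_info()
        if info.state == "success":
            info.duration_secs = time.monotonic() - start
            return info
        if info.state == "error":
            raise VimFaultException(info.error_msg, info.faults)
        # Still running: the caller logs "progress is N%" and sleeps.
        time.sleep(interval)


# Toy usage mirroring the CopyVirtualDisk_Task failure in the log.
if __name__ == "__main__":
    responses = iter([
        TaskInfo(state="running", progress=0),
        TaskInfo(state="error",
                 error_msg="A specified parameter was not correct: fileType",
                 faults=("InvalidArgument",)),
    ])
    try:
        wait_for_task(lambda: next(responses), interval=0.01)
    except VimFaultException as exc:
        print(f"task failed: {exc} faults={exc.faults}")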
[ 2403.530194] env[60764]: DEBUG oslo.service.loopingcall [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2403.530393] env[60764]: DEBUG nova.compute.manager [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network deallocation for instance since networking was not requested. {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2403.532407] env[60764]: DEBUG nova.compute.claims [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Aborting claim: {{(pid=60764) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2403.533042] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2403.533042] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2403.535658] env[60764]: DEBUG oslo_vmware.rw_handles [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/23f240ad-be33-44ec-ad7f-ab450419fbff/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60764) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2403.596264] env[60764]: DEBUG oslo_vmware.rw_handles [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Completed reading data from the image iterator. {{(pid=60764) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2403.596264] env[60764]: DEBUG oslo_vmware.rw_handles [None req-2b4b567c-5028-4f78-a7d4-a1c5e98b0425 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/23f240ad-be33-44ec-ad7f-ab450419fbff/04a11ac6-e7d0-45d1-a0cb-4c22c4396f6d/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60764) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2403.641165] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1210daf7-79d5-42f4-b040-8468222ad0a5 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.648624] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74914fe8-0ce4-499a-b830-997c993123b8 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.677485] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d74b7c-9604-4a04-afd2-55e199a3bc92 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.684045] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02bb11f4-ba17-4931-91e8-004d10d5c55a {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2403.696414] env[60764]: DEBUG nova.compute.provider_tree [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Inventory has not changed in ProviderTree for provider: 67a94047-1c18-43e8-9b47-05a1d30bcca4 {{(pid=60764) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2403.704736] env[60764]: DEBUG nova.scheduler.client.report [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Inventory has not changed for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60764) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2403.718862] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.186s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.719377] env[60764]: ERROR nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2403.719377] env[60764]: Faults: ['InvalidArgument'] [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Traceback (most recent call last): [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2403.719377] env[60764]: ERROR nova.compute.manager 
[instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self.driver.spawn(context, instance, image_meta, [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self._fetch_image_if_missing(context, vi) [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] image_cache(vi, tmp_image_ds_loc) [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] vm_util.copy_virtual_disk( [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] session._wait_for_task(vmdk_copy_task) [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] return self.wait_for_task(task_ref) [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] return evt.wait() [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] result = hub.switch() [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] return self.greenlet.switch() [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] self.f(*self.args, **self.kw) [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] raise exceptions.translate_fault(task_info.error) [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Faults: ['InvalidArgument'] [ 2403.719377] env[60764]: ERROR nova.compute.manager [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] [ 2403.720329] env[60764]: DEBUG nova.compute.utils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] VimFaultException {{(pid=60764) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2403.721348] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Build of instance b12b4099-9dd1-4219-823d-79cdef6a4e5e was re-scheduled: A specified parameter was not correct: fileType [ 2403.721348] env[60764]: Faults: ['InvalidArgument'] {{(pid=60764) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2403.721703] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Unplugging VIFs for instance {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2403.721931] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2403.722095] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2403.722257] env[60764]: DEBUG nova.network.neutron [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Building network info cache for instance {{(pid=60764) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2403.746026] env[60764]: DEBUG nova.network.neutron [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance cache missing network info. 
{{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2403.805793] env[60764]: DEBUG nova.network.neutron [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2403.814408] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Releasing lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2403.814628] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60764) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2403.814804] env[60764]: DEBUG nova.compute.manager [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Skipping network deallocation for instance since networking was not requested. {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2403.901654] env[60764]: INFO nova.scheduler.client.report [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Deleted allocations for instance b12b4099-9dd1-4219-823d-79cdef6a4e5e [ 2403.916583] env[60764]: DEBUG oslo_concurrency.lockutils [None req-bce3c1a4-f2ad-49d5-81d0-ebd845e20ad1 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 414.104s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.916847] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 91.642s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2403.917038] env[60764]: INFO nova.compute.manager [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] During sync_power_state the instance has a pending task (spawning). Skip. 
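The resource tracker and placement records earlier in this section fit together arithmetically: three actively managed instances at {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} each, plus the 512 MB reserved in the MEMORY_MB inventory, give exactly the reported used_ram=896MB, used_vcpus=3 and used_disk=3GB, and the repeated "Inventory has not changed" lines mean the freshly built inventory dict matches what placement already holds, so nothing is re-sent. Below is a small sketch of that accounting using the numbers from the log; the build_inventory() helper is illustrative, not nova's.

# Sketch of the accounting behind the "Final resource view" and the
# "Inventory has not changed" messages earlier in this section. The numbers
# come from the log; the function name is illustrative, not nova's.

RESERVED_MEMORY_MB = 512  # 'reserved' in the MEMORY_MB inventory above

# The three instances the resource tracker reports as actively managed,
# each with allocations {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}.
allocations = [
    {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},
    {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},
    {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},
]

used_ram_mb = RESERVED_MEMORY_MB + sum(a["MEMORY_MB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)
used_disk_gb = sum(a["DISK_GB"] for a in allocations)
assert (used_ram_mb, used_vcpus, used_disk_gb) == (896, 3, 3)  # matches the log


def build_inventory(total_vcpus, total_ram_mb, total_disk_gb):
    """Rebuild the inventory dict reported for provider 67a94047-1c18-43e8-9b47-05a1d30bcca4."""
    return {
        "VCPU": {"total": total_vcpus, "reserved": 0, "min_unit": 1,
                 "max_unit": 16, "step_size": 1, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": total_ram_mb, "reserved": RESERVED_MEMORY_MB,
                      "min_unit": 1, "max_unit": 65530, "step_size": 1,
                      "allocation_ratio": 1.0},
        "DISK_GB": {"total": total_disk_gb, "reserved": 0, "min_unit": 1,
                    "max_unit": 176, "step_size": 1, "allocation_ratio": 1.0},
    }


# In this sketch, "Inventory has not changed" is a plain equality check: the
# newly computed dict equals the previously reported one, so no update is sent.
current = build_inventory(48, 196590, 400)
previous = build_inventory(48, 196590, 400)
print("inventory changed:", current != previous)  # -> False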
[ 2403.917215] env[60764]: DEBUG oslo_concurrency.lockutils [None req-8066f492-ec0f-43d5-a260-08f6d2093801 None None] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.917433] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 21.336s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2403.917642] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2403.917856] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2403.918030] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2403.919754] env[60764]: INFO nova.compute.manager [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Terminating instance [ 2403.921248] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquiring lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2403.921638] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Acquired lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2403.921638] env[60764]: DEBUG nova.network.neutron [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Building network info cache for instance {{(pid=60764) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 2403.946123] env[60764]: DEBUG nova.network.neutron [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2404.003324] env[60764]: DEBUG nova.network.neutron [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2404.011955] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Releasing lock "refresh_cache-b12b4099-9dd1-4219-823d-79cdef6a4e5e" {{(pid=60764) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2404.012360] env[60764]: DEBUG nova.compute.manager [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Start destroying the instance on the hypervisor. {{(pid=60764) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2404.012544] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Destroying instance {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2404.013038] env[60764]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a0dff1f9-fb01-452e-953d-3b61e14b2b2f {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2404.022050] env[60764]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b807e87a-1c61-418f-9132-bdf950d6ca58 {{(pid=60764) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2404.045433] env[60764]: WARNING nova.virt.vmwareapi.vmops [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b12b4099-9dd1-4219-823d-79cdef6a4e5e could not be found. [ 2404.045627] env[60764]: DEBUG nova.virt.vmwareapi.vmops [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance destroyed {{(pid=60764) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2404.045799] env[60764]: INFO nova.compute.manager [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Took 0.03 seconds to destroy the instance on the hypervisor. 
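The second terminate of b12b4099-9dd1-4219-823d-79cdef6a4e5e above runs after the first destroy already removed the VM, so the backend lookup fails; the driver logs the InstanceNotFound as a warning ("Instance does not exist on backend") and still reports "Instance destroyed" so the compute-manager cleanup can proceed. Below is a minimal sketch of that tolerate-missing pattern; the helper names lookup_vm_ref and unregister_and_delete are hypothetical, not nova's API.

# Sketch of the "already gone" handling visible in the second terminate above:
# the backend lookup raises a not-found error, the driver logs a warning and
# carries on so the instance is still reported destroyed. The helper names
# (lookup_vm_ref, unregister_and_delete) are hypothetical, not nova's API.
import logging

LOG = logging.getLogger(__name__)


class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""


def destroy_instance(instance_uuid, lookup_vm_ref, unregister_and_delete):
    """Destroy a VM, treating a missing backend VM as already destroyed."""
    try:
        vm_ref = lookup_vm_ref(instance_uuid)
        unregister_and_delete(vm_ref)
    except InstanceNotFound as exc:
        # Mirrors: "WARNING ... Instance does not exist on backend: ..."
        LOG.warning("Instance does not exist on backend: %s", exc)
    # Either way the instance is considered destroyed on nova's side.
    LOG.debug("Instance destroyed")


if __name__ == "__main__":
    logging.basicConfig(level=logging.DEBUG)

    def missing(uuid):
        raise InstanceNotFound(f"Instance {uuid} could not be found.")

    destroy_instance("b12b4099-9dd1-4219-823d-79cdef6a4e5e",
                     lookup_vm_ref=missing,
                     unregister_and_delete=lambda ref: None)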
[ 2404.046046] env[60764]: DEBUG oslo.service.loopingcall [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60764) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2404.046313] env[60764]: DEBUG nova.compute.manager [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Deallocating network for instance {{(pid=60764) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2404.046409] env[60764]: DEBUG nova.network.neutron [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] deallocate_for_instance() {{(pid=60764) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2404.065355] env[60764]: DEBUG nova.network.neutron [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Instance cache missing network info. {{(pid=60764) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2404.072546] env[60764]: DEBUG nova.network.neutron [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Updating instance_info_cache with network_info: [] {{(pid=60764) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2404.080844] env[60764]: INFO nova.compute.manager [-] [instance: b12b4099-9dd1-4219-823d-79cdef6a4e5e] Took 0.03 seconds to deallocate network for instance. [ 2404.165944] env[60764]: DEBUG oslo_concurrency.lockutils [None req-144200fc-6397-4c54-8b05-13871e8955c6 tempest-ServerShowV247Test-1051045822 tempest-ServerShowV247Test-1051045822-project-member] Lock "b12b4099-9dd1-4219-823d-79cdef6a4e5e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.248s {{(pid=60764) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
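Nearly every operation in this log is bracketed by oslo_concurrency.lockutils lines of the form 'Acquiring lock "X" by "Y"', 'Lock "X" acquired ... waited N s' and 'Lock "X" "released" ... held N s'; for example the terminate lock above waited 21.336s behind the rescheduled build and was then held for only 0.248s. The sketch below reproduces that waited/held bookkeeping around a plain threading.Lock; it is a simplified illustration of the logging pattern, not the oslo_concurrency implementation.

# Simplified illustration of the waited/held bookkeeping behind the
# 'Acquiring lock' / 'acquired ... waited' / 'released ... held' lines that
# bracket most operations in this log. Not the oslo_concurrency code, just
# the same pattern around a threading.Lock.
import contextlib
import logging
import threading
import time

LOG = logging.getLogger(__name__)
_locks = {}


@contextlib.contextmanager
def logged_lock(name, caller):
    lock = _locks.setdefault(name, threading.Lock())
    LOG.debug('Acquiring lock "%s" by "%s"', name, caller)
    t0 = time.monotonic()
    lock.acquire()
    t1 = time.monotonic()
    LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs', name, caller, t1 - t0)
    try:
        yield
    finally:
        lock.release()
        LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                  name, caller, time.monotonic() - t1)


if __name__ == "__main__":
    logging.basicConfig(level=logging.DEBUG)
    with logged_lock("compute_resources",
                     "nova.compute.resource_tracker.ResourceTracker._update_available_resource"):
        time.sleep(0.1)  # stand-in for the work done while the lock is held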